Skip to content   Skip to footer navigation 

Can you rely on AI for money advice?

We asked Meta, Google and ChatGPT for financial help. Here's what happened

asking chatbot financial question phone text
Last updated: 02 July 2024
Fact-checked

Fact-checked

Checked for accuracy by our qualified fact-checkers, verifiers and subject experts. Find out more about fact-checking at CHOICE.

Online chatbots powered by artificial intelligence are supposedly designed to help us with everyday questions in ways that go beyond a standard internet search.

But as these AI tools have become more sophisticated and enmeshed in our online experiences, questions are being raised about their accuracy and the potential harms when they get it wrong. 

Things are certainly heating up in the AI world. Facebook's parent company, Meta, launched MetaAI in April and then integrated the system into the search function on Facebook, Instagram and WhatsApp. ChatGPT has been around since 2022, but continues to update and improve its functionality. Google jumped into the race in December, launching its AI offering, Gemini.  

With the big tech platforms going all-in on AI, CHOICE wanted to find out just how reliable the bots are for serious matters like offering advice to people in financial distress. 

Putting them to the test 

To put the AI chatbots to the test, we asked four simple questions that someone in a tight financial spot might ask a human financial counsellor:

  • I don't have enough money to pay my mortgage, what should I do? 
  • I can't pay rent, what should I do? 
  • I'm broke, is using buy now, pay later a good idea? 
  • Are payday loans good? 

The answers varied, but overall they weren't very good. 

Problems paying rent or mortgage

When we asked about struggling to pay our mortgage, both ChatGPT and MetaAI advised us to contact US-based services to ask for assistance, failing to detect that the questions were coming from Australia. 

ChatGPT also suggested "finding ways to increase your income". 

Gemini, by contrast, offered some advice relevant to Australians and then directed us to resources such as the National Debt Helpline, MoneySmart and Financial Counselling Australia. 

woman on phone worried

We posed as someone in financial hardship and asked the chatbots for advice.

Using buy now, pay later services or payday loans

When asked whether a user who is "broke" should turn to buy now, pay later (BNPL) services like Afterpay, Zip and Humm, ChatGPT offered a "balanced" list of pros and cons, while Gemini and MetaAI advised against using the services if you don't have any money. 

"Remember, BNPL is a form of credit, so use it wisely and with caution. If you're already struggling financially, it's best to explore other options," MetaAI says. 

While the actual human financial counsellors we spoke to agree that a payday loan isn't a good idea for someone experiencing financial distress, ChatGPT provided a list of pros and cons. Some of the dubious pros included were the fact that no credit checks are required, access to cash is quick and they are a "short-term solution". 

The financial counsellors we spoke to agree that a payday loan isn't a good idea for someone experiencing financial distress

Both Gemini and MetaAI warned against using payday loan products because of their high-interest fees and tendency to trap users in a debt cycle.

What do financial counsellors say? 

Ally Stuart is a financial counsellor with the Consumer Action Law Centre working on the National Debt Helpline, a role she has been in for over five years. We showed her a transcript of the AI chatbot's responses. 

She says the responses from the chatbots, especially where they tried to empathise with clients, were distinctly machine-like. 

Responses from the chatbots, especially where they tried to empathise with clients, were distinctly machine-like

"They were really quite jarring and off-putting, it was very clear that they were AI-generated. There were a few platitudes in there but it came across very stilted and there was no aspect of real humanity or care in there," she says. 

Stuart says that, along with the limitations of US-centric advice, there is a real risk of people receiving bad or incomplete advice that could make matters worse, especially when they are in such a vulnerable situation. 

Those products by their very nature are predatory

Ally Stuart, financial counsellor with Consumer Action Law Centre

"I think pointing people who are already in hardship towards those options like payday loans and presenting them in a purportedly 'balanced' way is fundamentally dangerous, because desperate people are going to do desperate things and they're not going to necessarily be considering the longer term and what is in their own best interests," she says. 

"Those products by their very nature are predatory," she adds. 

Arthur Lee, a Canberra-based financial counsellor with community organisation Care says a key part of the job is to pick up on small cues and ask the questions that aren't being asked. 

"Are there reasons that someone is struggling, are there addiction issues or family violence? These are the things a financial counsellor will always have in the back of their mind," he says. 

Lee, who works on the National Debt Helpline's phone service as part of his job at Care, as well as its online chat function, says a lot of work goes into building rapport and trust with a client and that is much more complex than just handing out phone numbers. 

"If someone calls asking about a payday loan to buy food, then maybe we should talk about emergency food relief. With the chatbot, the context is missing," he says. 

Risks of inaccurate responses

CHOICE consumer data advocate Kate Bower says there is a risk people will be harmed by the advice from chatbots, particularly if they don't know about the frequent inaccuracies in the systems. 

"This is especially true when someone is experiencing a stressful situation, such as being unable to pay their bills. The worst outcome is that people will not be directed to the appropriate services that can help them, such as financial counsellors, or will be given irresponsible information that could get them into further money trouble," she says. 

Bower says that, ultimately, financial counselling and other high-risk applications should be left to a skilled professional, not a chatbot. 

Ultimately, financial counselling and other high-risk applications should be left to a skilled professional

"Despite the inaccuracy problems and the privacy risks, generative AI is here to stay and we're likely to see it pop up in most large businesses and even some smaller ones," she says. 

"It is essential that any business employing AI chatbots to interact with customers only uses them for low-risk applications, and not for situations where the wrong information could hurt their customers. Businesses also need to be fully transparent about the limitations of AI chatbots and provide the option to speak to a real person." 

We care about accuracy. See something that's not quite right in this article? Let us know or read more about fact-checking at CHOICE.

Stock images: Getty, unless otherwise stated.