International Finance

Will ChatGPT be the new private banker?


Artificial intelligence (AI) has become the new normal in the 21st-century global socio-economic order. Ever since OpenAI released its ChatGPT chatbot in late 2022, the tool has been applied to almost everything: summarising books and texts for artistic or creative input, drawing pictures, data analysis, API integration, customer service, language comprehension, and building resumes. Generative language models are slowly becoming part and parcel of every sector, including banking.

How is the banking sector adopting AI?

Banks have accelerated their AI research and use cases due to the rise of ChatGPT. Legacy institutions are also facing the heat from the fintechs, which are deploying state-of-the-art AI-backed models to take care of functions like customer service, fraud detection, and automation of repetitive tasks. As pioneers in the digital revolution, fintech companies were among the earliest adopters of AI to support financial services and operations.

Now, feeling the heat from these new players, legacy financial institutions like JPMorgan Chase, Bank of America, and Goldman Sachs are embracing the technology to reduce costs, boost efficiency, strengthen compliance, personalise service, enable predictive analytics, and sharpen their competitive edge.

Technologies like machine learning, data analysis, natural language processing (NLP), and computer vision are now widely used in the financial sector when it comes to understanding the financial behaviour and requirements of customers before offering them tailored products.

Talking about banking industry biggies adopting AI in their operations, Morgan Stanley in June 2024 launched its “AI @ Morgan Stanley” suite of GenAI tools for Financial Advisors (FAs). The OpenAI-powered tool, with client consent, generates notes on a Financial Advisor’s behalf in client meetings and surfaces action items. After the meeting, it summarises key points, creates an email for a Financial Advisor to edit and send at their discretion, and saves a note in Salesforce.

Another Wall Street biggie, Goldman Sachs, has taken a measured approach in adopting AI. As of March 2025, half of the 46,000 employees at the investment banking giant have access to the technology. The firm is currently experimenting with agentic AI, which has yet to be deployed across the firm, despite the apparent benefits in the automation of key tasks such as compliance checks or the processing of customer transactions.

Broader use of generative AI within the company came with the launch of GS AI Assistant, which rolled out in 2024 and has been expanded to 10,000 employees, including bankers, traders, and asset managers. This tool, which Goldman Sachs anticipates will be available to nearly all employees by the end of 2025, can summarise documents, draft emails, analyse data, and create personalised content.

However, it is JPMorgan Chase that has been aggressive in adopting AI in its operations. The bank has taken a top-down approach to adoption, adding Chief Data and Analytics Officer Teresa Heitsenrether to its technology leadership team in June 2023 and putting an AI assistant called LLM Suite in the hands of 140,000 employees by October 2024. The company also rolled out ChatCFO, a generative AI tool for finance teams, apart from implementing prompt engineering training for new hires.

Now, a very compelling question: Can banks use ChatGPT or, in that sense, any other AI chatbot as a personal finance advisor?

Discussing the possibility

In 2024, two universities in the United States analysed more than 10,000 responses to financial exam questions from large language models such as ChatGPT and Bard. They found that AI is not likely to replace human advisors any time soon.

The Washington State University and Clemson University study asked the AI tools to provide the reasons behind their answers and compared the responses with those from human advisors. While the researchers found that two versions of ChatGPT performed the best (particularly 4.0, a paid version), accuracy dropped as topics became more advanced.

The AI responses were best for questions around securities transaction reviews and monitoring market trends, but the models struggled with areas such as client insurance coverage or tax status.

Study author DJ Fairhurst of WSU’s Carson College of Business said, “It’s far too early to be worried about ChatGPT taking finance jobs completely. For broad concepts where there have been good explanations on the internet for a long time, ChatGPT can do a very good job at synthesising those concepts. If it’s a specific, idiosyncratic issue, it’s really going to struggle.”

To prove their point, Fairhurst and co-author Daniel Greene of Clemson University used questions from licensing exams, including the Securities Industry Essentials exam, as well as the Series 6, 7, 65, and 66.

To move beyond the AI models’ ability to simply pick the right answer, the researchers asked the models to provide written explanations and chose questions based on specific job tasks that financial professionals might actually perform.

Of all the models, the paid version of ChatGPT, version 4.0, performed the best, providing answers that were the most similar to human experts. Its accuracy was also 18% to 28% higher than the other models. However, things changed when the researchers fine-tuned the earlier, free version of ChatGPT 3.5 by feeding it examples of correct responses and explanations. After this tuning, the AI model came close to ChatGPT 4.0 in accuracy and even surpassed it in providing answers that were similar to those of human professionals.

“Both models still fell short, though, when it came to certain types of questions. While they did well reviewing securities transactions and monitoring financial market trends, the models gave more inaccurate answers for specialised situations such as determining clients’ insurance coverage and tax status. Fairhurst and Greene, along with WSU doctoral student Adam Bozman, are now working on other ways to determine what ChatGPT can and cannot do with a project that asks it to evaluate potential merger deals. For this, they are taking advantage of the fact that ChatGPT is trained on data up until September 2021 and using deals made after that date, where the result is known. Preliminary findings are showing that so far, the AI model isn’t very good at this task,” reported SciTechDaily in December 2024.

The researchers’ final verdict was that ChatGPT can be better used as a tool to assist rather than as a replacement for an established financial professional. On the other hand, AI may change the way some investment banks employ entry-level analysts, as Fairhurst said, “The practice of bringing a bunch of people on as junior analysts, letting them compete and keeping the winners – that becomes a lot more costly. So, it may mean a downturn in those types of jobs, but it’s not because ChatGPT is better than the analysts, it’s because we’ve been asking junior analysts to do tasks that are more menial.”

Weighing on the topic, Oliver Hackel, Senior Investment Strategist at Kaiser Partner Privatbank AG, said, “In any case, AI certainly doesn’t lack self-confidence, not even when it comes to crafting the right wording. This is demonstrated impressively when the chatbot is asked how Donald Trump would explain Bitcoin. You can hardly get the voice of the former US president out of your head afterwards. But are ChatGPTs from the US-based artificial intelligence research firm OpenAI or its numerous kin also suitable to act as investment advisors? Our virtual mystery shopping tour revealed that chatbots still lack the necessary financial education. Moreover, even more powerful generative language model versions in the future will not be capable of replacing intimate conversations between clients and advisors.”

First-hand experiences

Andrew Lo, director of the Laboratory for Financial Engineering at the MIT Sloan School of Management, sees LLMs (Large Language Models) like ChatGPT as “glorified search engines” that will excel at helping their users find information fast, apart from being a good source of general advice on how to set up a budget or improve a credit score. However, getting accurate answers on specific, sensitive financial questions is where the concerns start.

“Many AI platforms lack domain-specific expertise, trustworthiness, and regulatory knowledge, especially when it comes to providing sensitive financial advice. They might even lead individuals to make unwise investments or financial decisions,” Lo warned.

Still, as per an October 2024 Experian study, many Americans are already turning to AI chatbots for financial management help, and among the 47% who reported doing or considering the practice, 96% reported a positive experience.

However, a study from the broker analysis site Investing in the Web found that tools like ChatGPT might not be very good at the job. The researchers asked ChatGPT 100 finance-related questions and had the answers reviewed by industry experts at their company. According to the report, ChatGPT answered 35% of the financial queries incorrectly, meaning roughly one in three of its answers on finance and investment topics was hallucinated.

In response to questions like “How [do I] save for my child’s education?” “How does the average pension compare to the average salary?” and “What are the pros and cons of investing in gold?” the chatbot answered 65% correctly, while 29% were labelled incomplete or misleading, and 6% were found to be completely incorrect.

Pedro Braz, CEO of Investing in the Web, said in a statement that it’s important to cross-check the sources (of the answers generated from an AI tool), especially with financial information that relies on timely data that is subject to change, such as interest rates and daily stock performance.

“ChatGPT has well-recognised issues with up-to-date information. It is best to go to the very source of the information, rather than asking AI chatbots for financial data,” he added.

As Hackel entered OpenAI’s virtual office and asked his first question regarding a suitable investment strategy, the chatbot started out by alerting him that it was not a certified investment consultant and could not give specific investment recommendations.

“But as is the case with so many other subjects, ChatGPT quickly sheds its restraint when we chat about a hypothetical example. Our query asks ChatGPT to construct for an investor with a moderate risk appetite a multi-asset portfolio composed of 15 to 20 ETFs that outperforms a simple 50/50 portfolio of stocks and bonds over the long term. Within seconds, the advisory bot recommends a mix of low-correlated asset classes. Stocks, bonds, commodities, and alternative assets are just the ticket, the bot says, and it names corresponding ETFs,” he noted.

After a few more follow-up questions, Oliver Hackel and his team ended up with a portfolio of 25 ETFs that also incorporates small and midcaps, sector-based, factor-based, and thematic strategies as well as exposure to international markets alongside the United States in its equity component.

“The original portfolio also becomes broader and more diversified in its fixed-income component and its allocation to alternative assets in the course of the client advisory conversation. However, the electronic advisor seems a little overwhelmed by a sophisticated client like ourselves,” he noted.

The limits of AI in finance

While stating that AI-powered tools like Perplexity and ChatGPT can help people who are looking for advice on saving and budgeting, investment planning, and credit score improvement, Christina Roman, consumer education and advocacy manager at Experian, terms the technology a great starting point for consumers who otherwise might not be able to afford professional financial advice.

“I don’t think that this is going to make people reliant on AI for these types of services, but I think it’s a great tool that can help them to navigate their financial lives and to understand complex topics like investing and whatnot,” Roman said.

While each prompting experience will differ, an individual can provide relatively simple details about their financial situation, and generative AI can produce a fairly elaborate plan. The prompt has to be well-crafted, something like this: “I need help managing my money. I make $50,000 a year. I have $10,000 in credit card debt on one credit card and $2,500 in debt on another credit card. My rent each month is $750. My car payment each month is $450. I have $150 in other utility expenses. I only have $250 set aside in my emergency fund. Can you help me get on track?”

When Fortune.com entered this prompt into ChatGPT, the AI tool responded with separate sections such as “Budgeting with a 50/30/20 Rule (Customised for You)” (including a monthly income estimate), “Spending Breakdown,” “Debt Repayment Strategy: Snowball or Avalanche,” “Emergency Fund Goal,” “Budget Adjustments,” “Automate Payments and Savings,” and an “Example Action Plan for Next Month.”
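The 50/30/20 rule the chatbot applied here is simple arithmetic: 50% of monthly income for needs, 30% for wants, 20% for savings and debt repayment. As a rough sketch (using the article’s hypothetical $50,000 salary and the fixed costs from the example prompt; this is an illustration, not financial advice), it looks like this:

```python
# Sketch of the 50/30/20 budgeting rule applied to the article's
# hypothetical example: $50,000/year gross income.

def budget_50_30_20(monthly_income: float) -> dict:
    """Split a monthly income into needs/wants/savings buckets."""
    return {
        "needs": round(monthly_income * 0.50, 2),
        "wants": round(monthly_income * 0.30, 2),
        "savings_and_debt": round(monthly_income * 0.20, 2),
    }

monthly_income = 50_000 / 12  # roughly $4,166.67 per month (gross)
buckets = budget_50_30_20(monthly_income)

# Fixed costs from the example prompt: rent + car payment + utilities
fixed_needs = 750 + 450 + 150  # $1,350

print(buckets)
print("Needs headroom:", round(buckets["needs"] - fixed_needs, 2))
```

A real plan would, of course, work from take-home pay rather than gross income and fold the two credit card balances into the savings-and-debt bucket, which is exactly the kind of nuance a user has to verify rather than take from the chatbot on faith.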

So, well-crafted prompts become the guardrail here, along with asking the AI follow-up questions and adding more details about your financial situation and goals. This helps the platform understand your unique circumstances and offer more useful information.

However, Roman advises that people be very cautious with the output. AI platforms hallucinate, and that means the advice they offer may not be grounded in best practices, or even in any sound personal finance reality.

Furthermore, she advises users to keep the information they provide to any generative AI platform generic, since they may not realise how that information will be saved or used to train the AI model itself.

Generative AI platforms are innovating by the minute, and there is no question that the ChatGPT of 2025 is more accurate and detailed than the version which came out a couple of years ago. But that is exactly what worries some financial experts. Hallucinations can be hidden in plain sight, and individuals without financial expertise or experience may not know the difference.

In Andrew Lo’s recent research paper on using generative AI for financial advice, he cited an example where ChatGPT 3.5 made up the author names for a paper it used to back up its responses. While this may not seem like a serious offence, when it comes to statements that involve financial risk, hallucinations could ruin someone’s finances.

“AI platforms may not always disclose as much source or background information as you might want or need. For example, when asking ChatGPT for investment advice, it recommends investing in companies like Microsoft. While human financial advisors may do the same, everyday users may not realise that Microsoft has invested over $13 billion in OpenAI, the parent company of ChatGPT. The chatbot does not note the conflict of interest to users unless it is pointed out,” Lo suggested.

Future of AI in financial services

Working with a human financial advisor allows for more conversation-based financial planning. Details about one’s financial status and goals can be discussed further to create a personalised plan that includes an understanding of all risks tied to potential money moves.

Generative AI will provide financial advice from just a few minor details, and individuals who act on that insight without considering (or double-checking) their full financial picture could make costly mistakes.

Yes, LLMs are becoming more advanced, and one can imagine a future where generative AI is much more integrated into the financial advising ecosystem.

However, Michael Donnelly, the interim managing director of corporate growth at the CFP Board, says financial professionals are better placed than most to make a strong case that technology cannot replace human advice. He engaged in a similar conversation a decade ago, during the rise of robo-advisors.

Donnelly advocates that financial advisors learn to accept AI as a tool that is great for things like internal practice management, as the technology will save advisors time better devoted to strengthening personal relationships, a hallmark of the financial planning profession.

“For consumers, AI won’t eliminate the need to work with a human financial planner,” Donnelly said, though Lo expressed concerns for those without access to dedicated financial advisors.

“We don’t have any guardrails yet in terms of how large language models are able to provide advice to consumers. And I think that on the regulatory front, we do need to have more careful guardrails, but on the research front, it really opens up a whole new set of vistas for us to explore,” Lo added.

Lo equated the situation to the fact that consumers largely have access to lower-risk mutual funds or money market accounts, but there are much greater regulations when it comes to who can deal with riskier private equity or hedge fund investments. AI, largely, has no guardrails.

He recommends a three-pronged approach to making AI’s role in finance safer. First, investors must be aware of AI’s tendency to hallucinate and ensure that any response generated by tools like ChatGPT is thoroughly fact-checked. Second, financial institutions adopting AI as a personal finance advisor must build safeguards to detect abuse and misuse. Finally, strong regulatory frameworks must take precedence to guide responsible use.
