Emily Nash-Walker, Sr Director of Product Strategy at Tungsten Automation on finding real value for AI across financial services

The Bank of England has recently sounded the alarm over a potential AI bubble. Experts are drawing clear parallels with the dot-com boom: inflated expectations of the technology, huge investment, and limited returns or focus on added value. In the financial services sector, where innovation and risk are no strangers, the Bank of England’s warning couldn’t be more relevant.

Since the launch of ChatGPT, financial services and FinTech firms have dedicated unprecedented time and money to AI, from LLMs to predictive analytics and AI agents. Beneath this rapid adoption, however, lies a rising tension between experimentation and governance.

Shadow AI

Many FinTechs and traditional financial services firms are now running on “shadow AI” (internal systems developed without formal oversight, transparency, or risk management), creating a sort of AI “grey market”. This market promises huge innovation, but without proper management it undermines key governance, and in the FinTech space that means risking consumer data, consumer confidence, and ultimately trust. Left unchecked, shadow AI could trigger the industry’s next big credibility crisis and expose it to the next financial crisis.

AI Overextension

AI can have a hugely transformative effect on financial services and is at the forefront of changing the industry for the better. From fraud detection to customer service automation, there’s no doubt that AI has improved how institutions engage, analyse, and operate.

But the industry’s eagerness to innovate quickly has led to a familiar problem: overextension. According to MIT research, 95% of GenAI pilots never reach production. Meanwhile, McKinsey estimates that AI technologies could deliver up to $1 trillion of additional value each year if implemented effectively. But that is a big “if”.

Right now, too many organisations are experimenting in isolation, often in siloed AI labs where tools are built by small internal teams without full visibility or awareness from compliance or IT departments. Algorithms are being trained on partial or poor-quality data, and models are being deployed without clear documentation of how they make decisions. More than 81% of financial compliance experts are concerned about the accountability and explainability of AI-driven decisions. These are the very qualities that should underpin AI that delivers real, low-risk value for businesses.

Dangers of the AI Bubble

If the AI bubble bursts, it won’t be because of the technology; it will be because of how it’s being applied. And the more experiments an organisation invests in without demonstrable value, the more exposed it will be when the bubble pops.

As the bubble grows, so does shadow AI. The pursuit of innovation across sectors leads siloed teams to invest quickly, but often without the right guardrails.

Shadow AI bears many similarities to the early days of the cloud era, when employees adopted unsanctioned tools to move faster than IT could keep up, leaving organisations fragmented and exposed to risk. Innovation is as essential as it has ever been, arguably more so, but the same is now true of the risk posed by fragmentation.

In financial services, the implications are far more serious than in most industries. Consider the risks if a credit-scoring model built without audit trails begins making biased decisions, or if a KYC automation tool fails to detect a sanctions breach because it’s running on unvalidated data. Banks built on shadow AI lack the visibility to even identify these models, let alone test or assure them.

AI Governance

FinTech success depends on reliability, transparency, and data integrity. Once those foundations erode, rebuilding them becomes far harder than any technical fix. The solution isn’t to slow down innovation. It’s to govern it properly.

The whole industry needs to move beyond AI experimentation toward governed automation. Integrating AI responsibly into existing workflows, supported by clear oversight, robust data management, and explainable outcomes, has to be the priority.

Smart businesses focus on AI for the right reasons: on what is needed, practical, and measurable, rather than chasing ideas of what might one day be possible. Organisations need to be wary of the hype and focus on systems that deliver compliance, accuracy, and ROI.

Financial services has always had challengers pushing boundaries with new technology, and that has never been truer than today. It is also an industry that has long been susceptible to hype. But in this next phase of innovation, specifically AI adoption, the winners will prioritise something different: patience, precision, and accountability over efficiency, new features, and speed.

Heeding the Warnings

As the Bank of England has warned, overinvestment, combined with complacency about defining and reporting concrete value, may be inflating a bubble primed to pop. To prevent or limit exposure, leaders should ask three business-critical questions before pouring more investment into AI:

  • What business problem are we solving?
  • Is our data structured, accurate, and governed?
  • Can we measure the outcome and explain the result?

If the answer to any of these is uncertain, so is the risk. The danger with shadow AI is that the answer to all three is often opaque. AI’s potential in financial services remains enormous, but true intelligence doesn’t come from the newest model or the biggest dataset. It comes from disciplined execution.

When the hype fades, the organisations that endure will be those that integrate AI responsibly, manage data intelligently, and put compliance at the core of innovation.

As with the dotcom boom and many other technological revolutions, the question isn’t whether AI will reshape the sector; it’s who will still be standing when the dust settles. The difference will come down to who governs their AI with a focus on real value versus those who chase experimental AI without true accountability.

Learn more at tungstenautomation.com
