HMRC's Latest Moves: What's Changing for Your Taxes and Online Filings

Author: Adaradar | Published on: 2025-11-26

The Algorithm in the Courtroom

Judge Christopher McNall's decision to openly use AI in a tax tribunal (Evans & Ors v Revenue & Customs Commissioners [2025] UKFTT 1112 (TC)) is either a landmark moment for efficiency in the justice system or a step towards a less transparent legal process, depending on how you look at it. The case itself, dealing with offshore trust arrangements and double-taxation treaties, was deemed "well suited" for AI because it turned on written submissions and involved no witness testimony. That's a reasonable starting point.

The UK courts aren't exactly subtle about their intentions either. Updated guidance from October 2025 explicitly encourages judicial office holders to use AI responsibly, with an emphasis on independent verification and a strict prohibition against inputting case-related data into public tools (a reasonable fear, given the… let's call them “creative” outputs of some public AI models).

But here's the rub: transparency. Judge McNall should be commended for disclosing his use of AI. But how granular was that disclosure? Did he detail the specific prompts used, the AI model employed, and the steps taken to validate the AI's output? The article doesn't say. And that's a problem.

We're told that Sir Geoffrey Vos, Master of the Rolls, acknowledges AI's potential to deliver court decisions "in minutes." That sounds great on the surface, especially given the glacial pace of some legal proceedings. But he also raises a critical question: can AI replicate human "emotion, idiosyncrasy, empathy and insight?" It's a valid point. A purely data-driven decision, while potentially faster, might lack the nuance that a human judge brings to the table.

Digging Deeper: The HMRC Angle

The broader context here is HMRC's ongoing push for digitalization. They're pushing digital tax self-assessment, updating guidance for digital filings, and generally trying to drag the tax system into the 21st century. (A noble goal, I suppose.) They're even going after people who haven't claimed their Child Trust Funds: hundreds of thousands are apparently sitting unclaimed, worth around £2,212 on average, and HMRC is urging those eligible (accounts were opened for children born between September 2002 and January 2011) to claim their cash.

But there's a less rosy side to HMRC's tech adoption. They're reinstating a process to directly dip into bank accounts to recover debts (the direct recovery of debts, or DRD). This targets debtors with £1,000 or more in unpaid taxes who have ignored appeals and contact attempts. Crucially, they claim to leave a minimum of £5,000 in the account after recovery.
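The thresholds HMRC states can be sketched as a simple eligibility check. This is purely illustrative, assuming only the two figures quoted above (the £1,000 debt floor and the £5,000 protected balance); the function name and structure are my own, not HMRC's:

```python
# Hypothetical sketch of the DRD thresholds described above.
# The £1,000 debt floor and £5,000 protected balance come from HMRC's
# stated policy; everything else here is illustrative.

MIN_DEBT = 1_000           # DRD targets debts of £1,000 or more
PROTECTED_BALANCE = 5_000  # at least £5,000 must remain after recovery

def recoverable_amount(debt: float, account_balance: float) -> float:
    """Return how much could be recovered under the stated thresholds.

    Returns 0 if the debt is below the DRD floor or the account holds
    nothing above the protected minimum.
    """
    if debt < MIN_DEBT:
        return 0.0
    available = max(0.0, account_balance - PROTECTED_BALANCE)
    return min(debt, available)
```

So a £3,000 debt against a £6,500 balance would yield at most £1,500, because the recovery cannot touch the protected £5,000. The unanswered questions in the article, of course, are about everything this sketch leaves out: who gets selected, and how.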

Now, HMRC insists there are "safeguards" in place for vulnerable debtors. They allow appeals, offer in-person visits, and discuss payment plans. But how effective are these safeguards in practice? And how will AI be used in the DRD process? Will algorithms be used to identify potential targets, assess their ability to pay, or determine the appropriate recovery amount? The lack of clarity is concerning.

I've looked at hundreds of these policy announcements, and the consistent theme is efficiency and cost-cutting. In 2023/24, the cost of collecting £1 of inheritance tax (IHT) was 0.67p—just 0.02p less than income tax. The implication is clear: HMRC wants to make tax collection cheaper. AI, in their eyes, is a tool to achieve that goal. But at what cost to fairness and due process?

The Algorithm's Verdict: Proceed with Extreme Caution

HMRC's embrace of AI is a double-edged sword. The potential for increased efficiency and reduced costs is undeniable. But the lack of transparency and the potential for algorithmic bias raise serious concerns. We need clear guidelines, robust oversight, and a commitment to ensuring that AI is used to enhance justice, not simply to automate it. Otherwise, we risk creating a system where decisions are made by black boxes, and accountability is lost in the code.