Once upon a time (or so we tell ourselves), reporting a suspicious neighbour to His Majesty's Revenue & Customs (HMRC) was pitched as a kind of civic virtue: help us catch tax dodgers, protect the public coffers. On paper, a noble cause. In practice? The first step on a rickety road that leads straight into Orwell-land. Enter, stage left, HMRC's 'Strengthened Reward Scheme'.
Where to start? Let's begin with the lawn-mowing neighbour. Under the whistle-blowing scheme, the neighbour who seems a little too prosperous (shiny car, new patio, perhaps an extra gardener working weekends) becomes a potential tax dodger worth a tip-off. Add a financial reward for 'helpful citizens' and suddenly quiet cul-de-sacs start feeding on suspicion. Innocent families, quietly minding their own business, become the subject of bothersome community allegations. HMRC winds up chasing ghosts, wasting public time and money, while the real tax dodgers and thieves count their offshore millions. The noble goal of fairness? Subverted into a cottage industry of mistrust.
It does not end there. As part of this creeping expansion of 'public safety', we are now being told that cameras, not just your doorbell cam but large AI-equipped CCTV networks, might soon start reading your face, your gait, your tone, perhaps even your emotions. Emotion recognition software claims to detect anger, stress, agitation and other 'suspicious' expressions in crowds. Network Rail reportedly trialled such cameras at major UK train stations, using AI to register the moods of unwitting passengers. As government links up its databases, the challenge for privacy advocates grows harder, and the false positives will multiply, because many of the data sets are likely to be riddled with 'dirty data'. Remember, the name on your passport does not have to mirror your birth certificate.
A personal case in point: like many people, I do not go by the first name on my birth certificate, nor even my second. Years ago, when council tax first came in, I had a running battle with the council, incurring months of stress and threats over supposed non-payment. Why? Because they held my name in three permutations across various systems, suggesting three different people. I am by no means alone.
Imagine a grim Saturday morning: you glance at your phone as you rush for the train, brow furrowed, maybe with a touch of pre-commute scowl. The software flags you as agitated. A slight stutter in your step from juggling coffee and bag? Alarm bells. Before you know it, you are a 'person of interest', a nervous suspect in a sea of faces. Good luck proving the AI got it wrong. No matter how innocent you are, you can become a target through some algorithm's error, some misreading of brows, posture or facial lines. Emotion recognition technology remains highly contested; many experts warn it cannot reliably infer internal emotional states from outward expressions.
Combine a whistle-blowing culture that urges neighbours to report each other with pervasive surveillance that pretends it can see inside your head, and you begin to smell the rot: a society where anybody can be reported, flagged, scanned and judged, often wrongly, all in the name of security and public service. Trust frays. Privacy evaporates. Innocence counts for nothing.
We do not need soldiers in riot gear to get dystopia. We just need cameras, unaccountable algorithms and the pernicious business of incentivising suspicion. One minute you are enjoying a quiet rural life or walking down a city street. The next, you are under surveillance, not just for what you do, but for how you feel.
When suspicion becomes a method and emotion becomes data, the real theft is not of money; it is of dignity, autonomy and mutual trust. Yes, TRUST, a theme you will have read about extensively from me; it touches everything.
Remember when trust breaks, truth becomes negotiable. When trust grows, progress becomes possible. In a world built on doubt, trust is not naïve, it is revolutionary.
Posted on December 6, 2025