Today, we relaunch Take Back The Frame not as a glossy reboot, but as a louder, sharper, more intentional space for exposing the harms that systems keep hidden.
Right now, our focus is on an accountability failure too many refuse to name: stigma around mental health is being weaponized, not by fringe voices, but by the very systems meant to provide care, justice, and protection.
Humans Learning Humanity — While AI Learns From Them
Here’s the problem: we now have humans who must “learn” to be human again through empathy training while robots are learning from them. Welcome to Workers’ Compensation.
The concern in an AI world is very real. Last week, I listened to a great podcast from Lenny on Substack about the release of ChatGPT 5.0, apparently the fastest-growing and most successful product in history. This week, the same product’s launch received scathing reviews. That’s not just a stumble; it’s a major red flag.
Why does this matter for Workers’ Compensation? One word: Guidewire. The IT platform at the centre of so much controversy since the 2020 scandal remains a serious concern. Industry insiders tell me it’s at the root of many problems, despite millions spent on it. No one seems to know what it will actually do, and yet here we are, stepping into an AI-driven world where the stakes are higher than ever.
When Humans Forget Humanity — and AI Learns the Same Lesson
Workers’ Compensation can’t fix its own tech, yet it’s stepping into an AI future built on the same systems that already fail the people they’re meant to serve.
If the base system is broken and the technology stacked on top of it fails, what happens next? We already know that claim file documentation is still missing in many cases, a troubling pattern in its own right. So what exactly is Guidewire being used for, and what data is its AI being trained on? Does anyone truly know? AI is now embedded in almost everything for all of us, and while it can deliver huge productivity gains, in this system we have a serious problem, one that will only grow if left unchecked.
We’re not just talking about inefficiency; we’re talking about harm. And in workers’ compensation, that harm lands on people who are already vulnerable.
We need gatekeepers with lived and living experience inside the system, people who can identify blind spots before they become damage.
What I’ve Learned Up Close
I’ve met enough people in insurance, finance, social welfare — even child protection — to see the pattern clearly.
These aren’t just “broken systems.” They are systems that have systemized the humans inside them. And people are hurting.
Recently, I sat down with a senior government executive over coffee to talk about the urgent need for care for people caught in these systems, and the toll it takes when cries for help land in my lap.
I can’t simply tell someone in acute distress to “call a helpline,” especially when the system itself doesn’t provide one. That’s not care. That’s walking away. And right now, there is no pathway to connect people to the right help in those moments.
I spoke about situations where individuals were in severe crisis, where timely, appropriate action could mean the difference between life and death. I made it clear: this is not my role. I’m not a mental health professional, and it should never fall to those outside the system to catch people when it fails them.
The fundamental premise of good mental health care is simple: get people to professional help. Not avoid responsibility. Not hide behind procedure.
When Empathy Leaves the Room
Here’s my takeaway from that conversation: at first, there was empathy, pen in hand, listening. But within minutes, that shifted.
The conversation moved into organizational mode: silent box-ticking, process-checking, note-taking as if we were discussing a quarterly report, not a human being’s survival. The imperative for gentleness was replaced by the safety of procedure.
And I knew it wasn’t the individual’s fault. It was an automatic, trained response. Years of conditioning had created a default mode; even the word “reset” was used. Systems train people to be robotic about process.
They are not trained to do more than their job. “Empathy training” or generic customer care is not the same as the years of preparation healthcare professionals receive. Finance staff are trained to excel in finance, and so they should be. But here lies the disconnect: I used to think some didn’t listen. Now I know many can’t listen — because the workforce is imbalanced. In the places where care matters most, the mix is wrong.
It was a wake-up moment for me: harm will continue until there is a custodian of care and empathy at the highest levels of these organizations, not a “customer advocate” to manage complaints, but a senior leader with a permanent seat at the table whose sole mandate is to hold humanity at the centre.
Why We’re Passing the Mic
That’s why, starting today, we’re introducing Letters of Resistance, a space where those who’ve lived through these systems speak directly to you, without filters or spin.
Our first letter comes from long-time advocate Rosemary McKenzie Ferguson, who has spent decades standing with injured workers. Her knowledge is practical, hard-won, respected, and deeply human, exactly the kind of grounded expertise that should have a seat at the table but too often doesn’t.
Letters of Resistance — Edition Two
Some letters resist by telling the truth of harm. Others resist by equipping you to fight back. This is one of those letters.
Knowledge as Resistance — Taking Control of Your Workers’ Compensation Claim
Featuring: Rosemary McKenzie Ferguson
This Is Where You Come In
Free subscribers keep the conversation growing: you’ll get every Behind the Lens investigation, regular Letters of Resistance, and more.
Paid subscribers keep the work independent and unfiltered: you’ll unlock extended investigations, practices, private audio, the Moral Injury Vault, and more.
This isn’t just about Workers’ Compensation or any single system; we are looking at multiple intersecting systems. It is very much about what happens when systems forget how to be human, then teach those same blind spots to the machines now learning from them.
If we don’t act, AI will be trained on failure, bureaucracy, and the absence of care, and it will repeat those failures at a speed and scale we can’t undo.
We need to build, protect, and demand systems that are worthy of being copied.
Because in the AI age, what we teach is what we’ll live with.
Subscribe Free | Upgrade to Paid
Until next week,
Kathie