THE IMPLEMENTATION FAILED. NOT THE PEOPLE.
For several months I carried around a feeling I couldn't name. I was excited about AI. Genuinely, practically excited. I could see what it was changing about my own work. My productivity shifted in ways I could measure.
And I felt shame about saying so.
Not because anyone told me to be quiet. Because the gap between my experience and most people's experience was obvious, and I had no framework for talking about it honestly. The hype said AI would change everything. The reality was that most employees had no idea what it could do for them and no one had built the bridge between the technology and their actual work.
I couldn't be the hype. I couldn't be the doom. I needed an integrity position that let me advocate for the technology without pretending the disruption wasn't real. I didn't have it, and until I did, I couldn't talk publicly about any of it without feeling like I was performing.
So I dug around. For months. Reading, testing, building, breaking things to see what held up. I watched technical implementation specialists present AI tools to rooms full of people and fail. Not because the tools were bad. Because nobody had done the work of understanding what the humans in those rooms needed to hear before they could trust any of it.
The technology wasn't the failure. The implementation was. And the people who looked resistant or slow were neither. They were rational humans responding to a system that hadn't earned their trust yet.
People don't resist change. They resist bad change management. When adoption fails, the instinct is to blame the adopter. They didn't try hard enough. They're stuck in their ways. But almost every time I've traced a failed adoption back to its root cause, the failure was upstream. Insufficient context. No vocabulary bridge. No one asked what the person's actual constraints were before asking them to change.
The Dignity-Required System is my rough first draft of an answer. It maps bandwidth against identity so you can see, before you roll anything out, where resistance is likely and why. Not because resistance is a problem to overcome, but because it's information about what the system hasn't addressed yet.
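If you want the mapping in concrete form, here's a minimal sketch in Python. Everything in it is a placeholder, not the system itself: the field names, the 0-to-1 scales, and the 0.5 threshold are illustrative assumptions. What it demonstrates is the quadrant logic: low bandwidth plus high identity stake predicts the most resistance, and each quadrant names what the rollout hasn't addressed yet.

```python
# A minimal illustrative sketch of the bandwidth-vs-identity mapping.
# All names, scales, and thresholds are placeholders, not the actual
# Dignity-Required System.

from dataclasses import dataclass


@dataclass
class Person:
    name: str
    bandwidth: float       # 0.0 (no slack at all) .. 1.0 (plenty of slack)
    identity_stake: float  # 0.0 (role untouched) .. 1.0 (role redefined)


def resistance_signal(p: Person, threshold: float = 0.5) -> str:
    """Read likely resistance as information about the rollout,
    not as a verdict on the person."""
    low_bandwidth = p.bandwidth < threshold
    high_stake = p.identity_stake >= threshold
    if low_bandwidth and high_stake:
        return "high resistance likely: no slack, and the change touches who they are"
    if high_stake:
        return "identity work needed: explain what the role becomes, not just the tool"
    if low_bandwidth:
        return "bandwidth work needed: carve out time before asking for change"
    return "low resistance likely: still earn trust, don't assume it"


# Hypothetical team, scored before a rollout rather than after a failure.
team = [
    Person("analyst", bandwidth=0.2, identity_stake=0.8),
    Person("manager", bandwidth=0.7, identity_stake=0.3),
]
for member in team:
    print(f"{member.name}: {resistance_signal(member)}")
```

The numbers don't matter; the direction of the questions does. You score people before the rollout, and the output tells you what the implementation owes them, not what they owe the implementation.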
When a technology rollout fails and the people get blamed for it, something has gone wrong in the investigation. The implementation failed. The people were just the ones left holding it when it did.