When Guidance Becomes Compliance
A short story about Iceland, Well-Architected Reviews, and what drift really tells you
About ten years ago, when I started as a Solutions Architect at AWS, one of my early customers was based in Iceland. Good team, serious engineers, running large production systems for international customers.
On my first visit, they told me they did not want to run a Well-Architected Review (WAR).
Not ever.
A few months earlier, another AWS SA had come out, run a WAR, and basically turned it into an audit. The output was a verdict. He told them something like “your maturity is low, you are not doing things right, here is a list of what you are missing.” No conversation about context, no acknowledgment that some of their choices were deliberate trade-offs, no curiosity about why the system looked the way it did.
They were still angry about it months later, and they made sure I knew.
When I eventually did run the proper review with them, I did not open a spreadsheet. I asked them to walk me through how the system actually worked. Where the pain was. What they had tried and backed out of. What kept them up at night. We talked through trade-offs as peers. I shared some of my experiences and what I had seen other teams do, and where those choices had gone wrong. At one point they told me, quite directly, that if I had shown up doing what the previous SA had done, it would have ended badly. Icelanders are scary when they want to be.
We ended up at the restaurant that evening. We had a good night.
I think about this story a lot, because it sits right on the fault line of something I want to write about: the way guidance, over time, drifts from a tool that helps teams reason into a control that judges them.
The drift is not an accident
Here is a pattern I have watched play out repeatedly, inside AWS and outside it.
Someone smart writes down what they learned. “Here is what we figured out, adapt it to your context.” It is a practice. It lives in someone’s notebook or a team wiki. People read it, argue with it, take what works, and move on.
Then it gets popular. More people want to use it. So someone writes it up more carefully, adds structure, makes it easier to consume. Now it is a guideline. Still useful. Teams use it as a starting point for their own thinking.
Then the guideline gets adopted by more teams. And as the conversations multiply, they start to feel expensive. Each one takes a couple of hours. Multiply that by dozens of teams, then hundreds, and, cumulatively, it looks like a lot of senior engineering time spent on something that seems like an obvious candidate for automation. So the content gets extracted. A checklist appears. The reasoning behind each item is still there in someone’s head, but the artifact itself no longer carries it. The checklist gets reviewed quarterly. Now it is a control.
Then the control gets audited. Teams are measured against it. Scores appear. Green, yellow, red. The checklist no longer describes a way of thinking, it describes a thing you either did or did not do. The reasoning behind each item has faded. People who implement the items often cannot tell you why they matter. They just know they get flagged if the box is not ticked.
Now it is compliance.
At each step of this drift, something is lost. The context disappears. The trade-offs disappear. The “it depends” disappears. What started as help ends as judgment.
What the drift reveals
It is easy to describe this as an accident of institutional life, but I do not think that is quite right. Organizations drift toward compliance because compliance is cheaper than learning.
Compliance is fast. You can audit a hundred teams against a checklist in a week. You cannot run a real conversation with a hundred teams in a week. Compliance scales. Learning does not.
Compliance is legible to leadership. Green-yellow-red dashboards make a VP feel like the organization is under control. A conversation about trade-offs does not fit in a dashboard.
Compliance creates accountability without requiring judgment. If something goes wrong and the team was compliant, nobody is in trouble. If something goes wrong and the team made a judgment call that did not pan out, someone is in trouble.
So when I see an organization drift from practice to control, I do not see bad intent. I see an organization optimizing for efficiency, legibility, and risk transfer. Those are real values. They are just not the same as the values that produce good systems.
Good systems come from teams who can reason about their context, make trade-offs deliberately, and adapt when conditions change. That requires learning. Learning is slow, messy, and hard to measure. So organizations that are under pressure to move fast and look accountable drift away from it. Not maliciously. Just gravitationally.
This is one of the core tensions I write about in Why We Still Suck at Resilience. Organizations say they want to learn from incidents, improve resilience, understand how their systems actually work. Then they build operating models that reward efficiency, legibility, and control. The two sets of values are in direct conflict, and when they conflict, efficiency wins. Drift toward compliance is what that looks like in the artifacts.
Well-Architected as a case study
Well-Architected started as a very useful artifact. In its early form, it was a structured way for AWS to have an informed conversation with a customer about their system. The pillars were prompts. The questions were designed to open up discussion. The output was shared understanding, not a judgment.
Then it scaled. More SAs needed to run reviews. Technical Account Managers got involved. Training got standardized. Lenses got added. A tool was built. Dashboards appeared. Partner programs were built on top of it. Customers and partners started asking for “Well-Architected certifications.”
Somewhere in there, the practice became a control. In some corners of the ecosystem, it became a compliance exercise. The reasoning behind the questions got compressed into answers you needed to give. The conversation got replaced with a form.
This is not a story about anyone behaving badly. Most of the people involved at each step were trying to help. The SA who audited my Icelandic customer a few months before me was probably trying to be rigorous. The person who built the first checklist was trying to make it easier for more people to benefit from the framework. The team that built the tool was trying to scale a good thing.
But the sum of all those well-intentioned steps was drift. And by the time I arrived in Iceland, a team had already learned that Well-Architected meant being judged by someone who did not understand their context.
To be clear, there are still SAs out there running great Well-Architected Reviews. I know some of them. They still treat the questions as prompts, the pillars as openings for a real conversation, the output as shared understanding. When it works, it really does work. The framework in the hands of a thoughtful reviewer remains a useful thing.
But the more customers I talk to, the more the word itself has become a problem. Well-Architected, for a lot of people now, means a monster process. Means judgment. Means a scorecard they do not want to submit to. I hear it in how they brace when the term comes up. They have been burned, or they have watched another team get burned, and the institutional memory of that is hard to undo. Whatever good work individual SAs are doing inside the framework is fighting against the reputation the framework itself has earned.
What the on-call engineer knows
The people who run systems at 3am know something that compliance frameworks cannot capture. They know which trade-offs were deliberate. They know which risks they are carrying on purpose. They know the history of the system, why it is shaped the way it is, what has been tried and why it did not work.
Guidance that respects that knowledge helps them reason. Guidance that ignores it judges them.
The test, for any framework, is whether it makes the team on call better at their job or whether it makes them better at passing audits. Those are different things. Most frameworks start out doing the first and, if nobody resists the drift, end up doing the second.
Resisting the drift is a choice, and it is not a free one. It means accepting that the framework will scale more slowly. It means leadership has to be willing to tolerate variance in how teams apply it. It means auditors have to accept that “it depends” is a legitimate answer. It means the framework itself has to be treated as a living thing, revised as the work changes, not frozen the moment it gets popular.
Most organizations will not make that choice. The gravity is too strong.
The organizations that resist the drift end up with teams who can actually reason about their systems. The ones that do not typically end up with a lot of green dashboards and a lot of incidents that nobody saw coming.
Closing
The Icelandic team I worked with is still out there, still running systems, still making trade-offs they understand better than I ever could. That day, ten years ago, I learned more from them than they did from me. That is usually how it goes when guidance is used as guidance.
The drift toward compliance is the nature of things. But noticing the drift, naming it, and choosing to resist it, in your own corner of the work, is also a thing you can do.
It just takes longer. And you do not get a dashboard for it.
//Adrian