The Luxury Layer
Episode I
I would like to begin with a single‑question survey:
Do you need thinkers or doers?
The likely answer is doers.
So, are thinkers really needed?
Being able to afford a group of thinkers is a luxury.
What I Mean by “Thinkers”
You may have noticed—or are about to notice—that in this age of AI, roles or titles such as Principal Engineer, Specialist, Architect, and similar are being reframed as:
- Too narrow
- Non‑executional
- Redundant
They are called “redundant” because it takes time for the work to become detailed enough to match a thinker’s vision. Those visions aim to:
- Reduce future failures
- Prevent scaling disasters
- Encode tribal knowledge
None of the above shows up cleanly in a quarterly EBITDA report, which makes these roles hard to justify. They remain “redundant” until things actually start breaking.
The Promotional Message to Curtailed Thinkers
“We don’t need thinkers anymore; we need doers + tools.”
AI is indeed useful. It replaces:
- Repetitive synthesis
- Boilerplate decision‑making
- Low‑context abstraction
However, low‑context abstraction is not enough to grasp the unsaid: the tribal knowledge. In its current form, AI needs significant help to develop:
- Systems intuition
- Failure anticipation
- Boundary judgments
I am not saying AI can’t do it, but AI + a thinker would pair nicely.
What Thinker Roles Provide
- Long‑term systems thinking
- Tacit, experience‑earned knowledge
Replacing judgment with tools won’t make systems safer. We are entering an era where bad architecture ships faster and gets paid for later. Soon, thinkers will find it frustrating to keep justifying this “luxury layer.”
Early in a career, having to justify yourself is part of growth; after years of grinding, that same justification can feel like erasure. The value of these roles is:
- Cumulative
- Contextual
- Earned through scars
AI is great, and its worth is easy to see in:
- Automation (tools)
- Acceleration (productivity)
Judgment, on the other hand, is not as easily quantified. It requires high‑context systems thinking. Over time, AI will catch up, giving rise to newer problems that expand the context even further.
The Hidden Value of Thinkers
The value of such roles resides in the negative space—the things that don’t break. Their impact is delayed or not even felt; their work is preventive, not demonstrative. This work often goes unnoticed and can feel dispensable, even though it is essential for long‑term stability.