6 Comments
Bianca Schulz:

Very good analysis. I would add one thing: companies probably want to cut simple administrative roles or junior team members. In my experience, however, the real problem is leaders who can contribute neither technically nor in terms of domain expertise, methodology, or process design, let alone shape any of these. So the question is how efficient the layoffs will really be when the wrong people stay.

I also think that if you genuinely redesign workflows with AI and properly dig into the subject matter, entirely new creative ideas for new business areas emerge, and then you need people again: the very people you may have just let go.

Perhaps in the end the winners will be the companies being built right now, with much flatter, process-oriented organizational structures and an AI-first approach from day one.

Andrei Savine:

Keeping the "wrong" people and letting go of those who should stay has a very strong impact.

Because a company's culture is not what's said on the slide deck or on the wall posters.

It is who gets promoted, and who gets fired. The rest is just lyrics.

So, before anyone is let go, the company must create a process for reskilling, recalibrating, and reintegrating people into the new model.

"Wait, what new model?"

The new operating model that defines what job is done, where in the organisation it's done, and what role will fulfil the job.

Then there's an LNA (learning needs analysis) that defines, for everyone concerned: what skills they have, what skills they need, what training and learning path has to be put in place, what projects they will be working on, what the new career development path is, and so on.

If leadership skips or neglects these steps, the company will hit the wall soon enough and end up in the statistics of failed AI transformations. Or maybe even in the news.

Bianca Schulz:

100% !!!

I assume you will need a lot of people just to get the software engineering harness around AI agents and all the governance-related topics right.

My advice to companies would be: everyone in a fairly general role must learn either business, data, software engineering, infrastructure, or governance.

Lots of work to do; it will be an ongoing learning journey.

Peter Rex:

What you're describing from the inside of the organization, I've been watching from the outside — and the well metaphor holds from both angles. The difference is that from where I'm standing, I'm not sure the organization can be convinced to put the bucket down. Not because the math is wrong, but because the math is right on the wrong timescale. Quarterly reporting is a structural problem, not a persuasion problem.

The dog in the park isn't in the ROI deck. That's the whole issue.

https://peterrex1.substack.com/p/the-well-and-the-bucket?r=60rv9f

Andrei Savine:

You are right, Peter. The well metaphor is powerful and well suited.

It's all about incentives on the right time scale. A CFO probably won't really care about a three-year horizon. But he or she will care about a failed AI transformation rollout, or an efficiency gain that was expected but never delivered at scale.

Peter Rex:

I really hope so.

Where I have my doubts is in their conclusion.

Will they adopt the "human investment" approach, or will they decide: "We need a different kind of worker"?