AI Can’t Read the Room: What Leaders Still Need to Slow Down For

AI is speeding up how we build, but it cannot replace the moments where leaders need to slow down and pay attention.

There is a quiet shift happening in software teams right now, and it is not just about speed.

We can build faster than we ever could before. Features that used to take days can now take hours. Entire sections of an application can be scaffolded in a single sitting. Problems that once required deep focus and patience can often be resolved with a well-phrased prompt.

On the surface, this feels like clear progress. In many ways, it is. But speed has a way of hiding what it removes, and what it quietly leaves behind.

One of the clearest places I have seen this is in the rise of what people casually call “vibe coding.” It has become remarkably easy for someone, even without a traditional software development background, to build something that works. They can connect pieces together, generate logic, and get a functioning application up and running in a day.

That accessibility matters. It opens doors. It allows ideas to move faster than they ever have before.

But the moment that work enters a team environment, the reality becomes more complicated.

What looks complete on the surface often carries hidden costs underneath. The structure is inconsistent. Files are scattered in ways that do not follow any clear pattern. Security considerations are not always obvious. And most importantly, there is very little shared understanding of why things were built the way they were.

This is where the tension begins to show up. When a piece of software has been created quickly, the expectation is that everything else should move just as quickly. When it needs to be stabilised, refactored, or in some cases rewritten, the question that follows is almost always the same.

If it was built in a day, why does fixing it take so long?

It is a fair question if speed is the only lens being used. It is a much harder question if you have spent years learning what makes software sustainable. Good systems are not just about whether they work. They are about whether they can be understood, maintained, extended, and trusted over time. Those qualities have always required care, and care does not compress as easily as execution does.

This is the part that AI does not remove. It simply makes it less visible at first.

I felt a different version of this tension when I started using AI more deeply in my own work. For most of my career, I wrote everything myself. I learned to think through structure, to anticipate problems, and to debug without relying on external tools. That process shaped how I approach engineering, and it became something I trusted.

Letting AI take on parts of that work was not immediate. There was hesitation, even resistance. It felt like stepping away from something I had spent years refining. Over time, that shifted. The benefits became clear, and I found myself leaning on it more naturally.

But that transition made me more aware of how uneven this shift can feel across a team. Not everyone arrives at the same level of comfort at the same time. Some people move quickly and embrace it. Others need space to understand it, to test it, and to trust it.

When everything around you is accelerating, it becomes tempting to expect that everyone should move at the same pace. It is an easy assumption to make, especially when the outputs look so immediate.

But understanding has its own pace, and it cannot be rushed in the same way.

I saw this very clearly while working on a recent project where I was rewriting native mobile applications in Flutter. It was the first time I leaned into AI in a way that directly shaped the structure of a real system rather than just assisting with isolated tasks.

At first, everything appeared to be working. Files were generated quickly. Code compiled. On the surface, there were no obvious issues.

And yet, something did not feel right.

The way files were organised lacked consistency. Folders did not follow a clear structure. It technically worked, but it did not feel like something that would hold up as the project grew. That instinct did not come from the output itself. It came from experience, from having built and maintained systems long before AI was part of the process.

The issue was not the tool. It was the absence of clear direction.

Once I started being more explicit about how things should be structured, where files should live, and what patterns should be followed, the output improved significantly. The quality did not change because the tool became better. It changed because the inputs became more intentional.
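One way to make that kind of direction concrete is to write the agreed conventions down as something checkable rather than leaving them as tribal knowledge. As a minimal sketch (the feature-first layout and folder names here are hypothetical, not the actual project's conventions), a small script can verify that every feature folder contains the expected layers, whether the files were written by hand or generated by a tool:

```python
# Hypothetical sketch: encode folder conventions as a small automated check,
# so quickly generated structure can be verified against the agreed pattern.
from pathlib import Path

# Assumed convention: every feature folder contains these three layers.
REQUIRED_LAYERS = ("data", "domain", "presentation")

def missing_layers(feature_dir: Path) -> list[str]:
    """Return the conventional sub-folders a feature is missing."""
    return [layer for layer in REQUIRED_LAYERS
            if not (feature_dir / layer).is_dir()]

def check_features(root: Path) -> dict[str, list[str]]:
    """Map each feature under root/features to its missing layers."""
    features = root / "features"
    return {f.name: missing_layers(f)
            for f in sorted(features.iterdir()) if f.is_dir()}
```

A check like this does not replace the judgment behind the conventions; it just makes the intention explicit enough that fast output can be held to it.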

This is where a quieter risk starts to emerge. AI does not just accelerate good practices. It also accelerates weak ones. Without experience or clear thinking behind it, speed can amplify inconsistency rather than reduce it.

At the same time, the way we work around the code is also shifting. Because some of the deeper, more time-consuming tasks are now faster, there is more space in the day. More time to respond, more time to engage, more time to move between different threads of work.

That can be a positive change, but it also introduces a different kind of pressure. When everything becomes more immediate, conversations can become more reactive. Decisions can be made quickly, not because they are fully understood, but because the pace around them makes it feel necessary.

This is where leadership begins to shift in subtle ways.

AI can help you write code faster. It can help you move through tasks with less friction. What it cannot do is interpret the human signals that sit around the work. It cannot tell you when someone is overwhelmed, when someone is unsure but not saying it, or when a decision feels rushed even if it appears logical.

Those signals still require attention. They require presence. And most importantly, they require a willingness to slow down when everything else is encouraging you to speed up.

Right now, the hardest thing for me to slow down is not the technical work itself. It is the constant flow of ideas that come with this new pace. There are always more features we could build, more improvements we could make, and more opportunities to move quickly because the tools allow it.

The challenge is not deciding what can be done. It is deciding what should be done now, and what should wait.

Those conversations are rarely comfortable. They interrupt momentum. They require clarity and sometimes pushback. But they are also where thoughtful leadership starts to show.

AI is one of the most powerful tools we have been given as builders. It has changed how we approach problems, how we execute, and how quickly we can move from idea to implementation.

What it has not changed is the need for judgment.

Not everything that can be built quickly should be. Not every decision should be made at speed. And not every output that looks correct carries the depth required to support a team over time.

AI can help you build.

It cannot decide what matters.

And it cannot read the room.