
Nobody Wants To Hear This About AI And Software Timelines

  • Writer: Prathamesh Kulkarni
  • 6 min read

I am one of the biggest AI fanboys you will meet. I use it daily. I believe in it. I have seen it genuinely make people more productive. So when I say what I am about to say, understand it is not coming from a skeptic. It is coming from someone who has been in the trenches long enough to be genuinely concerned.


AI did not create the timeline problem. But it poured gasoline on it.


The Dream Being Sold


Somewhere between Andrej Karpathy coining "vibe coding" and every LinkedIn influencer posting a 30-second reel, the industry collectively lost its mind about what software delivery actually looks like.


The narrative became simple. AI writes code fast. Therefore, software ships fast. Therefore, your timeline should be shorter. Therefore, if it is not shorter, your team is the problem.


That narrative is being sold in boardrooms, in sales calls, in client meetings. And the people selling it are not lying out of malice. Most of them are just confused. AI is a new field. Companies are experimenting. Leadership is figuring it out in real time. I get that. But here is what I do not get: how that confusion consistently rolls downhill and lands entirely on the people actually building things.


What An Estimate Actually Needs To Cover


Let me tell you what a software project timeline actually looks like from the inside.

There is discovery. There is planning. There is architecture. There is deployment pipeline setup. There is documentation. There is bureaucracy, approvals, and change management. There are integrations with third-party systems that are poorly documented and maintained by someone who left the company two years ago. There are production issues that nobody scheduled but everybody knew were coming. There is firefighting. There is a client who was supposed to deliver APIs in week two and delivered them in week nine.


None of that is exotic. All of that is normal. All of that takes time.


But estimates are rarely built around any of that. They are built around development time. Development, which is honestly one of the smaller pieces in the grand picture of shipping something real. Everything else is treated as background noise, as if it will just sort itself out.


And then, when the project is at 60% development and the client is expecting UAT, everyone acts surprised. It is not a sudden failure. It is a slow burn that was visible from day one to anyone paying attention.


I know a project that was pitched as a clean three-month job. In and out. Simple. It took ten months to reach something stable enough to call production-ready. Not because the team was incompetent. Because nobody accounted for reality.


The Half-Knowledge Problem Is Real, And It Is Expensive


Here is a specific kind of pain I want to talk about.


There is a person. Usually on the client side, sometimes internally. They are technical enough to be dangerous. They had a conversation with ChatGPT or Claude, read half an article, and watched a YouTube video. Now they are an architect.


I have sat in calls where someone insisted we needed RAG for a use case that had absolutely nothing to do with retrieval. It did not matter how many times it was explained. They had read something, their AI chatbot agreed with them, and that was that.


These kinds of recommendations, made with confidence and zero accountability, cost four months on a different project I know of. I would have built that use case in two weeks with a straightforward approach. But because someone with half-cooked knowledge threw a shiny technology into the mix at the wrong stage, four months evaporated.


And what happened to that person? They moved on to the next meeting. Spreading more recommendations. Zero consequences.


The people who ate those consequences for four months? That is a different story.


Leadership Sells It. Devs Deliver It. Nobody Asks Who Answers For It.


Here is the dynamic I have watched play out more times than I can count.


Client asks if the timeline can be shorter. Leadership, because there is a deal to close and money on the table, says yes. The devs are told the new timeline. Some of them push back. The pushback is heard, acknowledged, and ignored. The project proceeds.


Months later, when things are behind, the client is frustrated, the backlog is massive, and UAT is nowhere in sight, the conversation quietly shifts. It becomes the dev team's problem. Leadership goes back to the client, hat in hand, asking for an extension. And somehow, in the retelling of the story, the team that was given an impossible timeline becomes the team that could not deliver.


I have been in those rooms. I have given accurate timeline estimates. I have watched them get overridden in the next conversation because they were inconvenient for the sale.


There are no consequences for the person who sold the dream. There are only consequences for the people who could not manufacture reality fast enough to match it.


The Trap Nobody Is Talking About


AI is supposed to make you faster. Okay. But here is what that has quietly turned into in a lot of organizations.


The slower people get fired. This is the first wave. Then the expectation recalibrates upward for everyone who remains. Now you need to be fast AND use AI. And if you are using AI and still not hitting these newly invented timelines, that is somehow even worse. What are you even doing?


It is a trap with no exit. People are not adopting AI out of excitement. Most of them are adopting it out of fear. Unlike people who had years to develop a genuine interest and comfort with technology (like me, who started my career with this technology), a huge number of people in this industry are being forced to learn under threat. And learning under threat does not produce mastery. It produces anxiety dressed up as productivity.


The deep generalists (like me), the people who can cover sales, architecture, deployment, firefighting, documentation, and client communication, keep their jobs. Good for them. But now the expectation is that one person covers everything that four people used to cover, faster, with AI assistance, at the same or lower cost.


And that person is scared. They know it. They do not say it. Because the moment you slow down, the moment you push back, someone is running the numbers on whether your salary is more expensive than a better AI subscription.


So they keep their head down. No questions. No pushback. Just deliver until they cannot deliver anymore.


Nobody calls it burnout. It just looks like productivity from the outside.


Yes, AI Is Genuinely Useful. That Is Not The Point.

Productivity is real. I have seen it. Tools like BMAD genuinely change what a person can do. AI is not a scam. In the hands of someone who knows what they are doing, it is legitimately powerful.


But that productivity gain has been taken, inflated, stripped of all context, and turned into a justification for decisions that were always going to hurt people.


"You are using AI, right? So why is this taking so long?"


Because we are building a real system. Because the client has not delivered what they promised. Because we are waiting on approvals. Because something broke in staging, and we are not shipping broken things to production. Because estimation is not just counting development hours and calling it a timeline.


What I Actually Want You To Walk Away With


I am not writing this to attack anyone. Leadership is confused. Clients are being sold dreams before problems are even properly understood. The industry is new at this. Everyone is figuring it out.


But figuring it out cannot keep meaning the same people pay the price every single time.

If you are a leader, ask yourself honestly: when did you last give your team a timeline and actually ask them if it was real? Not as a formality. Actually ask.


If you are a client, ask yourself: Do you actually know what you want yet? Because the dream you were sold and the system you need are often two completely different things, and the gap between them has a cost.


If you are a developer grinding through an impossible timeline right now, head down, scared, delivering: I see you. This is not normal. It should not be normalized.


The tools got faster. The expectations got louder. The humans in the middle are still just humans.


That math does not work. And pretending it does is not optimism. It is just expensive denial.


My hatred for "that should be pretty simple" started way before the AI era, but it has only been amplified since.

© 2026 by Prathamesh Kulkarni.
