The case against #NoEstimates, part 3: NoEstimates arguments and their weaknesses

I’ve spent the last two blog posts introducing the #NoEstimates movement, first discussing what it appears to espouse, and presenting some initial reasons why I reject it. I then covered the many solid reasons why it makes sense to use estimates in software development.

This time, let’s go through, in detail, the various arguments put forward commonly by the NoEstimates advocates in their opposition to estimates and in their explanation of their approach. Full disclosure: I’ve attempted to include the major NoEstimates arguments, but this won’t be a balanced presentation by any means; I find these arguments all seriously flawed, and I’ll explain why in each case.

Here we go, point by point:

  • “Estimates aren’t accurate, and can’t be established with certainty”

Let’s use Ron Jeffries’ statement as an example of this stance:

“Estimates are difficult. When requirements are vague — and it seems that they always are — then the best conceivable estimates would also be very vague. Accurate estimation becomes essentially impossible. Even with clear requirements — and it seems that they never are — it is still almost impossible to know how long something will take, because we’ve never done it before.”

But “accurate” is simply the wrong standard to apply to estimates. It’d be great if they could be totally accurate, but it should be understood at all times that by nature they probably are not. They are merely a team’s best shot, using the best knowledge available at the time, and they’re used to establish an initial meaningful plan that can be monitored and adjusted moving forward. They’re a tool, not an outcome. As such, the benefits of estimates, and their contributions to the planning and tracking process, exist even without them being strictly “accurate” per se. These benefits were itemized in my last post.
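
To make that concrete, here is a minimal sketch (my own illustration, with hypothetical task names and numbers, not anything drawn from the NoEstimates debate itself) of treating an estimate as a range rather than a single “accurate” number: roll up per-task three-point estimates with a quick Monte Carlo pass and report, say, a 50th and an 85th percentile figure that a team can plan and track against.

```python
import random

# Hypothetical per-task three-point estimates, in person-days:
# (optimistic, most likely, pessimistic)
tasks = {
    "data model":  (3, 5, 10),
    "API layer":   (5, 8, 15),
    "UI screens":  (8, 12, 25),
    "integration": (4, 6, 14),
}

def simulate_totals(trials: int = 10_000) -> list[float]:
    """Sample each task from a triangular distribution and sum the project total."""
    totals = []
    for _ in range(trials):
        totals.append(sum(random.triangular(low, high, mode)
                          for low, mode, high in tasks.values()))
    return sorted(totals)

totals = simulate_totals()
print(f"~50% chance of finishing within {totals[len(totals) // 2]:.0f} person-days")
print(f"~85% chance of finishing within {totals[int(len(totals) * 0.85)]:.0f} person-days")
```

A range like that is exactly the “informed best shot” described above: useful for planning and tracking, never claimed as a prophecy.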

Knowing the future precisely isn’t what estimating is about, actually. It’s a misunderstanding and a disservice to think it is. Here’s why.

Perhaps we can’t really predict the future, yet we have to, every day. The reality: anyone running a viable business has to do exactly that, predict the future, in multiple ways. The only way to live life, much less run a successful business, is to achieve competence in what amounts to predicting the future. Business leaders can seldom wait for perfect information to emerge before making decisions. Every business plan ever developed essentially entails “predicting the future” and figuring out what to do and how: what markets to enter, what products will be viable to offer, how much raw material to keep on hand, etc. Essentially, with everything we do, we’re implicitly weighing options, risk, cost, and value. Yes, we’re “predicting the future.”

Can we always be right as we do this? Of course not. Yet that seems to astonish and dismay the NoEstimates advocates. Case example: a major NoEstimates advocate discusses planning, strategy, and goal-setting, only to make the odd statement that “all these techniques require perfect, 20/20 foresight.” Such a view is quite misguided. In fact, there’s no plan, no estimate, no forecast that is ever intended to provide perfect, 20/20 foresight. None. To believe that it would or could, in fact, reflects a major misunderstanding of planning, estimating, and forecasting. In reality, all those approaches are simply techniques that one uses as a way of tilting the odds in one’s favor, and as a means to monitor progress against expected results and react accordingly.

Not to mention: from the point of view of working with business stakeholders, the quickest way to demonstrate one’s lack of business savvy in the real world is to scoff that no one can possibly predict the future.

  • “Estimates are misused by management”

Most NoEstimates presentations cite example after example of the ways that estimates are misused, usually accompanied by fun cartoons and snidely captioned photos.

Amazingly, that’s not especially useful. As Glen Alleman wrote, “anecdotes of failed or bad management processes cannot be the reason for abandoning those processes.” Hearing endless examples of misuse is like watching someone go into lengthy detail about all the ways you can get an infection in hospital, with the end goal being to argue we need to eliminate hospitals.

And by the way, it underscores the weakness of their argument when NoEstimates advocates cite the number of people who retweet an anti-management Dilbert strip as a good metric to support their case.

  • “Estimates get turned into commitments”

NoEstimates advocates emphasize (and lament) how estimates often are interpreted as a commitment or even a target. While this can be true and reflect a misuse of estimates, one also needs to recognize that commitments do need to be made in business, and that they always come from what amounts to an estimate, in some form, that a particular goal can be successfully met. NoEstimates people talk about drip funding, and promise stakeholders that they “can pull the plug quickly when necessary”, but at the same time, they say things like “we can commit on that part which is known: mostly tomorrow, maybe the day after. Longer commitment is not possible.” Or, “I prefer to commit only to what is needed in order to validate or deliver value to the customer now”. (Emphasis added.)

What, development won’t commit on any item past the day after tomorrow?! Business people will often rapidly conclude (fairly or not) from such anti-commitment talk that the people saying such things must actually have little or no idea or confidence in what they can deliver and when. And no real skin in the game: minimal planning, close to zero commitment. Trust deteriorates rapidly as a result.

Commitment and goal-setting matter for internal reasons as well. Without solid commitments to steer towards, even a competent, motivated team can drift along and not prioritize well: yak shaving, for example. Commitment and clear targets tend to focus both the mind and the daily “to do” list. It’s frankly a wonder that this would even be a source of controversy.

Estimates are just one part of an iterative, collaborative process of evolving a meaningful, achievable plan, getting everyone on the same page, ensuring (to the maximum degree possible) that the major necessary chunks of work have been clearly and transparently identified up front, alternatives identified and openly discussed, and, finally, yes, appropriate commitments forged. That process is not dastardly or underhanded or abusive or manipulative; it’s normal and desirable. There’s nothing wrong with it.

  • “What’s really needed is decisions, not estimates”

This is an odd instance of somberly intoned faux-wisdom and skewed logic, akin to saying: there’s no need to drive safely; we just need to avoid having an accident. In the same way, estimates are a time-honored means to that very end: their purpose is precisely to enable decisions. Without them, you’re making decisions blindly, without full consideration of their likely impact. If all we truly needed were decisions, we might as well flip a coin.

  • “Estimates don’t guarantee success, so why make them?”

More analogies come to mind in response: having a map doesn’t guarantee you won’t get lost, so why bring one? A weather forecast doesn’t guarantee it will or won’t rain, so why even consult it? But of course, no one ever expects those tools to be infallible; their purpose is to increase your odds of anticipating and/or dealing with the unexpected. The NoEstimates question above reveals a strange conception of “guarantee”, and in fact betrays a certain reluctance to accept ever being wrong. As with the earlier points about estimates being “accurate”, an estimate is also never a “guarantee”. It’s an informed best shot. Equating any estimate to a “guarantee”, or objecting that it isn’t one, shows a fundamental lack of understanding of what estimates are or how they’re used.

For some reason, this question is always raised triumphantly by NoEstimates advocates, as if it were an unanswerable, debate-ending, “drop the mic” moment: but it sorely misses the point.

Again going back to basics: estimates won’t necessarily be “accurate”. They’re not plans, or commitments, or schedules, and (most especially) they’re not guarantees. Estimates provide useful input to creating such things; they’re an attempt to “tilt the odds” in the favor of success. If you’re looking for anything to provide you with a 100% guarantee, you shouldn’t be in business at all.

Here’s what happens: if your estimates and your plan, closely monitored during the course of your project, reveal unexpected variation and/or gaps, you adjust appropriately. The gleeful NoEstimates consternation that plans don’t typically go exactly as scheduled in every respect is simply unfathomable. Every project plan ever made ends up varying from what was planned. Having a plan is what lets you react and adjust in the first place. If there is no plan, no target, there’s nothing in particular to push you into adjusting; you just keep coding away.
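
As a small illustration of what “closely monitored … adjust appropriately” can mean in practice (a sketch of my own with made-up milestone figures, not a prescribed method), the value of the estimate is that variance against it becomes visible and actionable:

```python
# Hypothetical milestone estimates vs. actuals, in person-days.
plan = {"milestone 1": 20, "milestone 2": 35, "milestone 3": 30}
actuals = {"milestone 1": 22, "milestone 2": 48}  # milestone 3 not finished yet

THRESHOLD = 0.15  # flag anything running more than 15% over its estimate

for name, estimated in plan.items():
    spent = actuals.get(name)
    if spent is None:
        continue  # nothing to compare against yet
    variance = (spent - estimated) / estimated
    status = "REPLAN?" if variance > THRESHOLD else "on track"
    print(f"{name}: estimated {estimated}, actual {spent}, "
          f"variance {variance:+.0%} -> {status}")
```

Without the estimated column, the actuals have nothing to vary from, and nothing prompts the adjustment.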

As for bias, it seems way better to get bias out explicitly on the table, through open declaration and active group scrutiny of estimates, than to have it enter through the backdoor silently.

  • “Estimates make it harder to change the plan”

“Estimation up front makes it harder for us to change the plan because, as we define the plan in detail, we commit ourselves to following it, mentally and emotionally.”

Why would that possibly be the case? Don’t let that happen! Estimates should be getting recalibrated constantly, feeding in any new information and (quite possibly) changing the approach, the timeline, the staffing, the scope of your overall effort.
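
Here is a hedged sketch of what that constant recalibration can look like (illustrative numbers only, not a method any particular estimating approach prescribes): re-project the completion date from the remaining scope and recently observed throughput instead of clinging to the original figure.

```python
from datetime import date, timedelta

# Hypothetical project state at a re-planning checkpoint.
remaining_points = 120                  # scope still to deliver, in story points
recent_throughput = [18, 22, 15, 20]    # points completed in recent iterations
iteration_days = 14

# Use the observed average; a more cautious re-plan could use the worst iteration.
average = sum(recent_throughput) / len(recent_throughput)
iterations_left = remaining_points / average
forecast = date.today() + timedelta(days=iterations_left * iteration_days)

print(f"Observed throughput: {average:.1f} points/iteration")
print(f"About {iterations_left:.1f} iterations remaining; "
      f"forecast completion around {forecast:%Y-%m-%d}")
```

The original estimate is the starting point; each new data point reshapes it, which is the opposite of being locked in “mentally and emotionally.”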

As new information arises, any plan can be changed, assuming that there’s a sober evaluation of the pros and cons of doing so. Note that there’s always a healthy tension between adhering to a game plan vs. panicking entirely and ditching the plan as soon as the first obstacle rears its head. Part of the role of a skilled product manager, and of management in general, is treading that narrow line, deciding when and how to react and regroup.

NoEstimates advocates often paint their critics as inflexible, “big design up front”-oriented, waterfall-obsessed dinosaurs. That’s a straw man. Analogy: if you’re taking a trip, there’s a very happy medium between “we’ll head east and completely wing it as we go” vs. “we’ll plan every stopover to the nearest second, up front.” In other words, you plan the trip, to the degree you deem appropriate for your needs, and you adjust appropriately as new information hits your radar. Some plans will need to be changed in major ways; others will absorb the inevitable minor variances and be (broadly speaking) just fine as initially defined.

  • “Estimates take too much time.” “Estimates are expensive to gather.”

It depends. It’s always easy to trot out egregious straw-man examples: don’t spend more time estimating etc. than “doing work”, one NoEstimates advocate has been known to tweet, as if that’s a common scenario. Of course it isn’t: time spent planning needs to be weighed (as with everything in business) from a cost-benefit perspective: if I’m about to spend $10 million on a software project, it’s worth spending six figures constructing estimates and a workable plan against which progress can be measured. Align the time and expense of estimating with your “value at risk”, in other words. This, again, should be common sense.
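
As a back-of-the-envelope sketch of that cost-benefit alignment (the 1–3% band below is my own assumption for illustration, not an industry standard), the estimating budget can simply be scaled to the value at risk:

```python
# Spend a small, bounded fraction of the value at risk on estimating and planning.
# The 1-3% band is an assumption for illustration, not a mandated figure.
def estimation_budget(value_at_risk: float,
                      low_pct: float = 0.01,
                      high_pct: float = 0.03) -> tuple[float, float]:
    return value_at_risk * low_pct, value_at_risk * high_pct

for project_value in (10_000_000, 50_000):  # the $10M project from the text, and a small tool
    low, high = estimation_budget(project_value)
    print(f"Value at risk ${project_value:,.0f}: "
          f"reasonable estimating/planning spend ${low:,.0f}-${high:,.0f}")
```

The same logic that says a $10 million program merits six figures of planning also says a small internal tool merits perhaps an afternoon; it is the opposite of estimating everything exhaustively.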

  • “Estimates aren’t required to deliver software”

This may be the weakest NoEstimates argument of all. Of course estimates are not required. A driver’s license isn’t strictly required to physically drive a car; a sterile environment isn’t required to operate on a patient; a referee isn’t required to play a game of soccer. Equally, doing upfront planning, or even using common tools like bug tracking or source code control, isn’t required to deliver software; I’ve (unfortunately often) seen teams go without such things. But just like a reasonable level of planning and the use of appropriate tools such as source code control, rational use of an estimating process can contribute meaningfully to the overall high-quality, timely delivery of software capabilities for an enterprise. Pointing out that estimates aren’t required in order to code, as if anyone had claimed they were, seems (quite intentionally) to miss the point.

  • “There are better ways”

OK, no, this is the weakest argument, because it’s not a bona fide argument at all. I really wish I could cite the specific better ways that NoEstimates advocates have proposed in lieu of estimates, but the declaration is usually formulated as above: vaguely, with nothing behind it. The only concrete “better way” that seems to get cited consists of drip funding for the team (“you can pull the plug whenever you want!”), story slicing of whatever piece of work is mysteriously judged “most important,” frequent delivery of software, and then iterate/refactor. The need for estimates then disappears, or so claim the NoEstimates advocates.

None of that is intrinsically a bad idea for some development circumstances, but it’s most certainly not universally applicable. And it doesn’t answer the up-front questions that don’t go away: what will be the likely total investment we’re making to get the full capability we need? And when will we get that capability?

  • “We’re not ditching estimates!”

Despite this now oft-repeated claim, the key NoEstimates advocates have yet to name any concrete examples of when estimates, in their view, actually are appropriate. As a result, the NoEstimates claim not to be ditching estimates comes off as defensive, disingenuous, and insincere, considering the intensity of the diatribes they’ve delivered against estimating.

In truth, NoEstimates advocates appear to acquiesce to doing estimates only in cases where they are basically forced to do so, by unreasonable and/or unenlightened management. But this kind of grudging acceptance of the need to do estimates, while metaphorically holding one’s nose (“estimates are a smell of possible decision-making dysfunctionality”), is no way to win the understanding of, or promote strong collaboration with, business people who have to grapple with uncertainty in their own areas every day. In fact, such a clearly insincere, pro forma acquiescence is also likely to forgo the positive elements and outcomes that a high-quality estimating process would bring to the table.

So there you have it, the principal arguments raised by NoEstimates in support of their case.

One last installment will be forthcoming in this series, giving my “bottom line” assessment of estimates vs. NoEstimates.

Comments

  1. Just figured something out. Ron’s statement:
    “Even with clear requirements — and it seems that they never are — it is still almost impossible to know how long something will take, because we’ve never done it before.”
    This is a sole contributor’s paradigm. Pretend we’re playing our roles at PWC (CIO advisor, Program Performance Management advisor) and have been asked by a new customer to develop a new product from scratch.
    What would we do? We’d go to our resource management database – held by HR – and ask for people with “past performance experience” in the business domain. Our new customer did not “invent” a new business domain. We’d look to see whether, among the 195,433 people, there is someone who knows what the customer wants. If there is no one, then we’d look across our 10,000 or so partner relationships worldwide to find someone.
    If there is no one, we’d not bid.
    This notion of “doing something that’s never been done before” really means “we’re doing something I’ve never done before.” And since I’m a sole contributor, the population of experience in doing the thing the new customer is asking for is ONE – me.
    So the answer to the question “what if we encounter new and unknown needs, how can we estimate?” is actually a real problem for the sole contributor, or the small team. This is the reason the PWCs of the world exist. They get asked to do things the sole contributors never get to see.

  2. Well, I think you’re right, but I also think that Jeffries would take umbrage at this description. He likely believes that it doesn’t matter; even if he has experience doing dozens of similar systems to what’s being asked for, it’s still not exactly the same, so uncertainty reigns. An odd point, in my view, but there it is.

    Although these people are in a sense factually correct (every piece of software is new by definition), nothing in my 30+ years of cutting code indicates that one’s personal experience in similar circumstances can’t usefully inform one’s outlook about the level of effort. My dentist putting on my crown doesn’t go on and on about how she’s never put a crown on THIS particular tooth before. It’s possible that software is rather more unique in its considerations than dentistry (for that seems to be the NoEstimates argument), but in my mind, not so much.

    Thanks for commenting.

  3. I’d suggest from my narrow Software Intensive Systems (SIS) point of view, that the missing element in that “it’s new” argument is – Systems Engineering. SE is the discipline to discover the Functional Area Analysis, Functional Needs Analysis, and Functional Solutions Analysis:

    ◆ Functional Area Analysis (FAA) is a capabilities based task analysis that provides the framework for assessing required capabilities for the success of the system.
    ◆ Functional Needs Analysis (FNA) assesses the ability of the current system to accomplish the needed tasks identified in the FAA, in the manner prescribed by the concept under the operating conditions and to prescribed standards.
    ◆ Functional Solutions Analysis (FSA) is an operationally based assessment of the potential strategies, organizations, training, leadership and education, personnel, facilities, and materials (technologies) for solving one or more of the capability needs.

    And then it can be determined whether there are any “new” aspects to the needed software. But in the absence of SE, “let’s get started coding and see what emerges” has no chance to ask or answer the questions of these three processes. In our space and defense systems, it’d be nonsense not to ask “where can we look for reference designs for this new problem?”

  4. “Business people will often rapidly conclude… that the people saying such things must actually have little or no idea or confidence in what they can deliver and when. And no real skin in the game: minimal planning, close to zero commitment.”

    I think this is the nub of a huge argument against No Estimates that they seem to completely ignore. I came into project management via marketing, and if one of my clients was bringing out a new piece of software they wanted to promote, my first question would always be the same: “What will it do, and when will it launch?” No Estimates essentially makes the answer to this question a shrug of the shoulders. And if that had been the answer a prospective client had given us, we would almost certainly have declined their polite offer and moved on.

    Obviously any answers that were supplied were estimates, and we worked with our clients as they revised those estimates. But the important thing was that we had an idea of what would be happening, a minimum feature set, and an idea of when we needed to get things done by. Without a method for answering that fundamental question, No Estimates simply can’t be taken seriously by the wider business community.

  5. Well said!

    NoEstimates, to the degree they acknowledge this issue at all, reply with a wave of the hand: “we’re not ditching estimates.” With no specifics.

    It’s fairly clear to me and others that the main NE advocates have had little interaction with either board-level/CEO people in their own company, or with high-level business stakeholders, because the questions are always the same, and the NE non-answers are such clear non-starters for those constituencies.

  6. Yes, that “we’re doing No Estimates-type estimates” confuses me a little. I read one post this morning which talked about “There is estimation taking place, but at no point would we attach an estimate to [a story]”, which seems to be a contradiction. If there is estimation taking place estimates are being made, even if they’re only in people’s heads. And things that are only in people’s heads disappear with those heads when they go to another job or get hit by a bus.

    I would agree that the lack of business nous is what is holding the No Estimates argument back. If they could get a C-level executive to explain why the company as a whole – not just one renegade team – had stopped estimating and how they now do things like creating profit forecasts then perhaps we could take the argument more seriously.

  7. Indeed. As I wrote in the post, “it seems way better to get bias out explicitly on the table, through open declaration and active group scrutiny of estimates, than to have it enter through the backdoor silently.” Somehow people imagine that estimates happening is just fine, so long as developers are protected from doing them. I can’t see the logic in that: estimate all you want, just don’t ask anyone with any actual ability to determine an estimate based on experience and facts? Bizarre.

    Thanks again for commenting, Ben.
