The case against #NoEstimates, part 2: why estimates matter

Last time, I introduced a very odd and very vocal recent movement known as #NoEstimates, which seeks ways to reduce or eliminate the use of estimates in software development. I started off by laying out some basic common-sense business reasons to reject it, which boiled down to:

  • estimates are flat-out natural, ubiquitous, and unavoidable in practical life and in business;
  • expressing general reluctance to do them unfortunately reinforces the often negative perception of IT people as aloof, uncooperative, and unsavvy about business imperatives.

Let’s look now at the many other solid reasons to keep estimates in the software development and project management toolbox:

Estimates help in project selection over a wider time frame, and they assist in filling a project portfolio evenly to align with overall team/company capacity. Specifically: companies constantly need to determine which projects to sink time and resources into, e.g., picking just three out of a list of fifteen proposed projects for a given time period. This is a common and recurring dilemma in every company I’ve ever worked at, where demand for business functionality always exceeds the supply of resources to fulfill it.

In such a ubiquitous scenario, what makes anyone think that the anticipated cost and duration for delivering each potential project should not figure into the decision process that selects the “vital few” from among the many choices? Why would you possibly rule out considering those factors to the best of your ability? Yes, of course they should be weighed along with value, risk, and other factors, but cost and schedule will always be among the key considerations to juggle.

You’re thinking that assessing probable cost and schedule can’t be done, because you’re uncomfortable with uncertainty, and perhaps you might get it wrong? I’m going to be blunt: then you’re not ready to play strategically in the business arena.

Estimates reduce overreliance on experimenting with “red herring” projects. Nothing is wrong with selective and judicious experimentation on a small scale, focused on learning and adjusting, but the universal NoEstimates answer to the project selection conundrum is to “just start”. Well, simply as a practical matter, you can’t “just start” 15 different projects as an experiment and gather enough useful, consistent information on all of them to feed into your decision meaningfully. Even if you could and did, you’d still have to use your gut (i.e., you’d have to estimate) based on what that information is telling you, since very few of the unknowns associated with each of the potential projects would have been eliminated by the short experiment. The NoEstimates panacea of “story slicing” doesn’t help at this up-front stage: unless (and even if!) you story-slice the entire project (which would be BDUF), unknowns will almost certainly continue to lurk in your backlog, some big, some small.

There’s simply no way around it: at some point, if you want to choose purposefully among competing options of similar value, you have to take a shot at determining what each project is likely to take (cost, schedule, dependencies), despite imperfect information.
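To make that concrete, here is a minimal sketch of how even rough, range-based estimates can feed portfolio selection. It is purely illustrative: the project names, value and risk scores, cost ranges, and the scoring formula itself are hypothetical placeholders, not a recommended method.

    # Illustrative only: rough range estimates feeding a simple portfolio pick.
    # Every name, number, and weighting here is a hypothetical placeholder.
    from dataclasses import dataclass

    @dataclass
    class Candidate:
        name: str
        value_score: int    # relative business value, 1-10
        risk_score: int     # relative delivery risk, 1-10 (higher = riskier)
        cost_low: float     # optimistic cost estimate, in person-months
        cost_high: float    # pessimistic cost estimate, in person-months

        @property
        def expected_cost(self) -> float:
            # Crude two-point estimate: take the midpoint of the range.
            return (self.cost_low + self.cost_high) / 2

        def score(self) -> float:
            # Value per unit of estimated cost, discounted for risk.
            return self.value_score / self.expected_cost * (1 - self.risk_score / 20)

    candidates = [
        Candidate("Customer portal rewrite", value_score=9, risk_score=7, cost_low=10, cost_high=18),
        Candidate("Billing automation",      value_score=7, risk_score=3, cost_low=4,  cost_high=7),
        Candidate("Reporting dashboard",     value_score=5, risk_score=2, cost_low=2,  cost_high=4),
    ]

    capacity = 15  # person-months available this period (itself an estimate)

    # Rank by score, then greedily fill the portfolio up to capacity.
    selected, committed = [], 0.0
    for c in sorted(candidates, key=lambda c: c.score(), reverse=True):
        if committed + c.expected_cost <= capacity:
            selected.append(c.name)
            committed += c.expected_cost

    print(f"Selected: {selected} (~{committed:.1f} of ~{capacity} person-months)")

The numbers are guesses, and that is exactly the point: even imperfect ranges are enough to compare options, fill capacity, and defend the choice, which is something “just start” cannot do for fifteen candidates at once.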

You don’t like it that there’s a possibility that you might get it wrong and pick the “wrong” projects? Bluntly, again: then you’re not ready to play strategically in the business arena. [Read more…]

The case against #NoEstimates, part 1: introduction and common sense

In the immortal words of Pogo, “we have met the enemy, and he is us.”

The long-held stereotype of IT portrays us as uncooperative, unable to integrate socially, always arguing over nits, and deeply, intractably immersed in our own tunnel vision and parochial perspective.

That stereotype has held us back, as individuals and as a profession: it’s actually one root cause for the oft-lamented situation of the CIO needing a seat at the executive table and not getting it.

Now a whole movement has arisen that unfortunately reinforces that negative perception of IT people, a movement that coalesces around the Twitter hashtag #NoEstimates (all movements need a Twitter hashtag now, it appears). It started with long screeds about how inaccurate estimates are for software development: they’re nothing more than guesses; “there is nothing about them that makes them necessary, or even beneficial to the actual creation of software”. They’re a “wasteful and deceptive practice”, lies, and needing them is even comparable to how heroin users need their heroin. You can’t predict the future, these estimate detractors insist. The #NoEstimates rhetoric has become increasingly harsh, and replete with drastic imagery: estimates are a “game of fools”; an “inherited disease for the industry”; “we predict like gypsy ladies.” In what seems at times to be some kind of “top the previous hyperbole” competition, estimates have even been referred to as “management by violence”.

This sort of overreaction, this IT resistance to estimates, isn’t really new, of course. More than once, I’ve watched in horror as an IT person earnestly explained to a company’s senior management how predicting systems delivery is so very difficult and so very filled with uncertainty, justifying why (for example) they couldn’t possibly commit to having a system deployed in a particular quarter. One time, I witnessed a VP of Sales who had zero patience with that: “Hey, I get asked to set my sales quotas way in advance based on my best professional judgment; what makes you folks in IT think that you should be an exception?” Business runs on goals and commitments, and on the “best judgment” estimates that help create those goals and commitments. And everyone in the room seemed to understand that, except IT.

Setting achievable, concrete goals is healthy, in business and in life. Making solid, reasonable commitments is healthy. Taking responsibility for meeting one’s commitments all or at least most of the time is natural and should be encouraged. And if you’re perceived as constantly wailing that you’re different and special, in fact so special that you believe you can’t really be expected to state when you’ll be done? That’s a non-starter. This blog post is an introduction to how some IT practitioners have been pulled in by the seductive but ultimately wrongheaded NoEstimates claims, and what the resulting implications are for our industry.

What’s #NoEstimates about? Well, its proponents are actually quite slippery about providing any solid definition, reflexively pushing back whenever anyone tries to summarize it for them; they often prefer to say vaguely that it’s just “a hashtag for a concept about alternatives to estimates and how they might help make better decisions.” However, it seems to boil down to a virulent and unshakable base conviction that estimates are utterly horrible, for all the reasons stated above and more. Using estimates allegedly leads to chronic abuse of developers by management, cements inflexibility into projects, and disturbs developer “flow”. The sole alternative to estimates that NoEstimates advocates clearly identify, however, seems simply to restate the age-old Agile principles: small teams, user stories sliced into doable chunks, “drip funding” to achieve a predictable fixed cost, and software delivery “early and often”. They point out that project scope tends to change drastically and often during such iterative delivery, and argue that this scope variability lets projects be declared done and successful without ever having delivered all the functionality identified up front.

Over the next two blog posts, I’ll first lay out the reasons to see considerable value in a software development estimating process and its outcomes, and then respond to the myriad NoEstimates complaints levied against estimates as used for software development. But we’ll start here, as an intro, just by invoking basic common sense about life and business.

[Read more…]

Towards a more balanced list of content about #NoEstimates

Both my readers will have noticed there’s been a fairly large gap between my posts here, as life (picnic, lightning, and all that) has intervened. Like J.D. Salinger, however, I have continued writing drafts on various topics, and I plan to post more in the coming months.

My past posts here have often delved into a favorite theme of mine: that IT people tend to go to extremes, often rejecting something useful (an approach, a technology, a tool) simply because it has downsides. Such rejection is at times emotional and even self-righteous; we can get so caught up in it that we fail to look at a topic at all evenhandedly, let alone dispassionately.

No better case in point has come along in the past year than the active and contentious #NoEstimates debate on Twitter and in the blogosphere. I’ll have a much more detailed post soon about my objections to the #NoEstimates approach overall (full disclosure: I’m one of its most vocal critics), but right now, let’s focus on one aspect of the relentless advocacy I see from the hashtag’s proponents: its lack of evenhandedness.

Specifically, proponents of #NoEstimates insist repeatedly and proudly that they’re “exploring”; recently, one major advocate tweeted out a call for links to posts about the topic (“I’m gathering links to #NoEstimates content”) so that these could be collected and posted. Yet, it turned out that only posts advocating one side of the issue would be included, even though the resulting list of links was then touted to people who might be “interested in exploring some ideas about #NoEstimates.” When challenged on this dubious interpretation of the meaning of “exploring”, the advocate then defiantly attached a disclaimer: “Warning! There are no links to ‘Estimate-driven’ posts”. In short, making the exploration balanced wasn’t even remotely his goal.

Advocates can use their own blog for whatever purposes they want, of course. Yet, there’s an interesting split going on here: staunchly claiming to be “exploring”, while rejecting the inclusion of any summarizing or critical posts, and then sneeringly labeling all such posts as “estimate-driven.” There couldn’t be a clearer case study of IT black-and-white-ism, them vs us. Explore all you want, this behavior says, as long as you’re doing it on my side of the issue and on my terms. What, there’s a post that attempts to summarize both sides of the argument? Not interested.

[Read more…]

IT does the moonwalk: our endless search for absolutes

Scene: I was CTO at a high-traffic social networking site, circa ten years ago. It was one of those times when our site got crushed by unexpected sudden volume, due to being mentioned in an article in a prominent newspaper. My infrastructure manager walked into my office the next morning, ashen-faced. “We’re gonna get killed tomorrow unless we add ten front-end servers to our prod environment,” he proclaimed. A fairly common IT reaction: absolute, adamant, ominous.

Ten new servers? That was a nice round pulled-from-thin-air number, obviously, and by the time we talked it through, we actually found other, more practical and feasible ways to first estimate and then handle the increased load. But to the infrastructure guy as he walked in, the situation was both dire and absolute, and he saw only one solution worth considering.

So now let’s look at another data point on IT psychology. Take the latest iPhone brouhaha: the quick “cracking” of the iPhone 5s Touch ID fingerprint scanning technology.  Amazingly, Touch ID has turned out to be less than perfect. Someone with $1,000 of equipment, plus lots of time, motivation, and patience, could conceivably fool the scanner. Meanwhile, what gets lost in the outrage over this turn of events is the notion that the technology might indeed be “good enough”, or “better than the alternative”. We forget the simple fact that the technology is primarily oriented to people who currently don’t use passcodes at all, and that it vastly improves general security for those sorts of users.  As one article pointed out, “The point of any security system isn’t to be unbreakable – there’s no such thing – but to be fit for purpose.”

My larger point: if there’s a problem or a difficulty or even a nuance to a particular approach’s applicability, a common IT practitioner’s instant reaction is that the approach or practice is absolute junk and should be completely avoided.

Similarly, we often reject fundamental improvements to a situation, simply because they are not perfect. We let “best get in the way of better.” On this general theme, an amusing tweet crossed my screen the other day. @rands wrote, “I find when an engineer says, ‘Less than ideal’, they often mean ‘Complete fucking catastrophe.’”  I laughed at this, of course, but partly because I’ve more often experienced that scenario in reverse: an engineer deciding, and then loudly and profanely proclaiming, that a situation was nothing short of a complete disaster, simply because it was less than ideal.

[Read more…]

Starting points for the quantitative CIO: downloadable basic tools

Much as in any field, IT executives constantly have to seek a balance between idealism and pragmatism. Given a particular problem and the range of possible solutions, do we insist on “doing it right”, or do we buckle down and “just get it done”, even with gaps?

There’s obviously no single right answer, which is what makes IT consistently so fun and frustrating at the same time. Over time, though, my own approach has typically been to focus on the continuous improvement aspect of “doing it right”: whenever possible, get something going as a start, then hone it over time as you learn more about the problem and your situation.

Using spreadsheets as a management tool definitely falls on the “just get it done” side of this spectrum of approaches. Spreadsheets are seductively easy, omnipresent, and usable by people with a wide range of skill sets and levels of technical savvy.

But there’s a host of downsides: spreadsheets are frail creatures. Errors can creep in fairly easily, even for experienced users, as data and circumstances change, and spreadsheets are especially prone to the incursion of silent errors and omissions when undergoing revision.  And once implemented, in all their imperfection, spreadsheet-based solutions can broaden and become large-scale, long-term systems (I’ve seen this happen again and again).

Yet, I feel that every technology executive should be maximally fluent in spreadsheeting: simple tracking, analyzing, modeling alternatives, understanding costs and risks. The technology provides a readily available, easy way to knock out quick and dirty models that can clarify one’s thinking and approach enormously. They work well, as long as you keep in mind that the spreadsheet is usually a stop-gap, for those times when you are faced with a glaring need and you don’t have time, budget, or staff to implement anything deeper right away.
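As a small aside, the same quick-and-dirty modeling can be done in a few lines of code instead of a spreadsheet; here is a minimal, purely illustrative sketch of a three-year cost comparison between two options, where every option name and figure is a made-up placeholder.

    # A quick-and-dirty cost comparison of the kind described above, sketched
    # in Python instead of a spreadsheet. All names and figures are placeholders.
    YEARS = 3

    options = {
        "Build in-house": {"upfront": 250_000, "annual_run": 60_000},
        "Buy SaaS":       {"upfront": 40_000,  "annual_run": 120_000},
    }

    for name, costs in options.items():
        total = costs["upfront"] + costs["annual_run"] * YEARS
        print(f"{name}: ~${total:,} total over {YEARS} years")

Either form serves the same purpose: a rough, disposable model that clarifies the decision, not a system of record.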

In an early blog post, I listed the seven areas where a quantitative approach is especially necessary for the technology executive:

[Read more…]

CMO outspending the CIO on technology: “so what?” Here’s what.

Rarely do I write targeted responses to specific blog posts, but last week, a CMO-related article crossed my screen that I think is both representative of many people’s attitudes and enormously flawed in its assumptions, logic, and conclusions. Esmeralda Swartz, writing for ReadWrite.com, opines in her title: “So What If Chief Marketing Officers Outspend CIOs On Enterprise Tech?” Even more grandiosely, the post’s subtitle asks, “Isn’t it possible that a technology buying process driven by marketers instead of technologists will make things better?”

Well, I suppose I should allow that anything might be possible, but no, not by the unconvincing (yet not atypical) line of argument Swartz pursues, and not when you consider standard business realities. Here are a few representative quotes related to the backbone of her argument, namely that buying technology is like buying a new car:

  • “Let’s look at an everyday example. Prior to investing large sums of money in a new car, few people feel the need to master the inner workings of the internal combustion engine.”
  • “Despite all this blindness, for the most part, what we buy doesn’t let us down.”
  • “Ultimately, we’ve got a problem that buying a car solves, so we buy a car.”
  • “Buying software – wait for it – simply because it threatened to get the job done – will likely ruffle some feathers.”

Here’s the thing, though. IT systems are not cars. 

[Read more…]

IT conferences for the CIO: microcosms of industry trends

I’m back from attending ServiceNow’s Knowledge13 conference last month in Las Vegas, and have a grab bag of random thoughts and reactions to share as a result. As usual, these thoughts reach beyond any particular vendor or product niche.

For anyone not familiar with this company, ServiceNow is slowly and steadily developing a generalized platform (“ERP for IT”) for enterprise IT management, all the way from IT service management (ITSM) to (now, in a new offering) cloud orchestration and management of instances.

My attendance last year at this same conference broke a personal streak of almost 8 years of avoiding conferences altogether. My recap post from last year discusses how I discovered what I’d been missing: exposure to new approaches, new energy, and new perspectives that, like it or not, don’t just come from being online.

In fact, it reminds me of the classic Woody Allen line about “I need the eggs”. Conferences are messy, chaotic, overwhelming, like sipping from a firehose, and so on. But we keep going, because we need those eggs.

Here are some “eggs,” large and small, that I took away from this year’s experience.

[Read more…]

CDO: The Chief Déjà Vu Officer

Whac-a-mole. It’s my favorite of all metaphors, at least in its applicability to IT. For those who don’t know the background: Whac-a-mole is a commonly seen arcade game, where plastic moles pop up at random through holes in the game panel. The job of the player, of course, is to pound them down again with a mallet, accumulating points with each kinetic, mind-clearing, vigorous whack. And, of course, the game keeps speeding up. The moles never stop coming.

Any readers who don’t instantly get the clear analogy to IT are probably reading the wrong blog.

A career spent in IT feels like a constant bout of Whac-a-mole. But here, again, is one key recurring “mole” that I find especially irritating: the proliferation, against all logic, of articles and tweets about the demise of IT, the death of the CIO, and how technology is now so easy, so omnipresent, that experts are no longer required.

I wrote about this ever-repeated meme a year ago in a post titled “IT consumerization, the cloud, and the alleged death of the CIO”. I railed against the meme, pointing out that “this frequent linking of cloud and IT consumerization to the looming demise of the CIO and IT is not just misguided, but actually gets it completely backwards. In fact, I argue that IT consumerization and the cloud will actually elevate the importance of IT within a company, as both a service and a strategic focus.”

But IT moribundity is a meme that somehow refuses to, uh, die.

[Read more…]