IT does the moonwalk: our endless search for absolutes

Scene: I was CTO at a high-traffic social networking site, circa ten years ago. It was one of those times when our site got crushed by a sudden, unexpected surge in volume after being mentioned in an article in a prominent newspaper. My infrastructure manager walked into my office the next morning, ashen-faced. “We’re gonna get killed tomorrow unless we add ten front-end servers to our prod environment,” he proclaimed. A fairly common IT reaction: absolute, adamant, ominous.

Ten new servers? That was a nice round pulled-from-thin-air number, obviously, and by the time we talked through it, we actually found other, more practical and more feasible ways to first estimate and then handle the increased load. But to the infrastructure manager as he walked in, the situation was both dire and absolute, and he saw only one solution worth considering.

So now let’s look at another data point on IT psychology. Take the latest iPhone brouhaha: the quick “cracking” of the iPhone 5s Touch ID fingerprint scanning technology. Amazingly, Touch ID has turned out to be less than perfect. Someone with $1,000 of equipment, plus lots of time, motivation, and patience, could conceivably fool the scanner. Meanwhile, what gets lost in the outrage over this turn of events is the notion that the technology might indeed be “good enough” or “better than the alternative”. We forget the simple fact that the technology is primarily aimed at people who currently don’t use passcodes at all, and that it vastly improves general security for those users. As one article pointed out, “The point of any security system isn’t to be unbreakable – there’s no such thing – but to be fit for purpose.”

My larger point: if there’s a problem or a difficulty or even a nuance to a particular approach’s applicability, a common IT practitioner’s instant reaction is that the approach or practice is absolute junk and should be completely avoided.

Similarly, we often reject fundamental improvements to a situation, simply because they are not perfect. We let “best get in the way of better.” On this general theme, an amusing tweet crossed my screen the other day. @rands wrote, “I find when an engineer says, ‘Less than ideal’, they often mean ‘Complete fucking catastrophe.’”  I laughed at this, of course, but partly because I’ve more often experienced that scenario in reverse: an engineer deciding, and then loudly and profanely proclaiming, that a situation was nothing short of a complete disaster, simply because it was less than ideal.

We in IT work in a realm where insistent idealism is rampant, oddly. It’s odd, because the need to compromise actually looms everywhere: in our systems requirements, the users we must satisfy, our budgets, our resource capability, our schedules, our legacy hardware and software. All engineering relies on the art of compromise; no bridge was ever built to withstand any possible load. Yet what’s most odd in IT is that various forms of technical, management, and methodological idealism persist, recur, refuse to die, often feeding on sheer inexperience mixed with equal parts hubris and testosterone.

So, throughout my career, I’ve seen IT people tend to gravitate to the ideal, often because we reflexively reject anything that has a perceived downside. Of course, idealism itself has an obvious downside that gets ignored in our new-found enthusiasm: nothing is actually ever completely ideal. Yet, talk to any adherent of a fad methodology or development approach, and you’ll rarely hear the associated downsides or potential flaws acknowledged in any way.

It’s IT doing a moonwalk of sorts, going forward and backward at the same time: objecting strenuously to anything with a downside, while simultaneously inclining towards often-absolute adoption of faddish ideals, and turning a blind eye to any downsides those ideals may have.

IT people typically don’t like situations where we have to deal with probabilities; we tend to prefer dealing in black and white. We want that certainty, so we seek it out, one way or another. If we can’t be certain, then our inclination is simply not to do it. If it’s going to involve hard work handling exceptions and edge cases, don’t do it. We look instead for an absolute, “one size fits all” substitute approach. Ten servers, installed overnight, will fix the problem just fine.

And as we seek those absolutes, we often implicitly or explicitly discard reasonable alternatives. Let’s look at some real-life examples where I’ve seen this dangerous “throw the baby out with the bathwater” attitude occur, often evoking clickbait headlines in IT publications, guaranteed to generate high clickthrough, such as “Eliminate your QA department!”

  • Dealing with operations just makes developers get “too close to the hardware,” plus developers hate having to sit down with the ops guys. So, let’s eliminate operations altogether (NoOps).

  • People often misuse the bug tracking tool, so let’s abandon it and just use index cards instead. (With one such team, I found hundreds of index cards randomly scattered in piles around the work area, scribbled with cryptic notes.)

  • Email gets overused and misused, so let’s declare “we will eliminate email entirely.”

  • Best practices? We can’t really tell what those are, and none is truly universal anyway. Ignore best practices.

This is a list of absolutes, a catalog of reactionary resistance, representing the rejection of fairly standard approaches and techniques that have turned out, surprise, to have some downsides or to require some nuanced thinking.

Let me be clear: no one can or should argue against idealism per se; often, much progress is driven by someone who just won’t give up on their ideal picture of how something should work. Yet the above list should help underscore the obvious: IT people often take their idealism too far. When we’re encouraged or tempted to adopt extreme stances because of perceived gaps or downsides in our normal approaches, there’s a high danger we’ll just be moonwalking: pulled backwards even as we attempt to walk forward. Our judgment is clouded by our search for absolute certainty. So we all need to cool down from our instinctive reactions along these lines, and look for ways to throw out just the bathwater while holding on firmly to that darned baby.

In short: shooting too hard for the ideal isn’t always ideal. It can be simplistic and an overreaction, and it often backfires. Be wary.


Comments

  1. Laura Lindsay says:

    I enjoyed this article — these are all too common behaviors and some of them are the stuff Dilbert comics are made of.

    But you also have me thinking about how these behaviors come into play in strategy work. Recently, I’ve been involved with an organization struggling with its client device strategy. The strategy is based on a directional statement along the lines of “we will eliminate desktop computers; everyone will have a virtual desktop or mobile device”.

    There are elements of goodness in this direction, but it was stated by a very high-level executive and taken literally by the staff, such that there are no gray areas when it comes to execution. And now there’s a tremendous amount of moonwalking going on.

  2. Interesting example, Laura. Sounds like one of those “from on high” dictates that may be having far more ripple effects and/or downsides than the executive realized. See my piece that quotes Dawn Lepore (CEO of drugstore.com), about her carefully differentiating between “is this a light bulb or a gun” when brainstorming in meetings.

    But the case you cite sounds to me a bit like the one of “we will eliminate email entirely” that I cite in the post. It takes some good points to an unnecessary extreme. Clearly, it’s not just a symptom of IT.

    Thanks for commenting!

  3. Peter, interesting article.

    While I can understand your line of thinking, I’m not sure I fully support your conclusion. I’ve been in this business for some time now, and I am not inclined to describe tech guys in the terms you mention. My perception of reality is that this sort of view is a relatively new phenomenon. I therefore ascribe it to a generational development and to the emergence of the agile generation. This generation, having been conditioned to reject anything that ‘smells’ like waterfall, is quick to judge any ‘old’ methods as outdated and overdue for replacement. This is the point where such examples as in your article kick in, and the rest, as they say, is history.

  4. Shim, thanks for commenting. Interesting that you see this as a relatively new phenomenon! I agree that it appears to have worsened in recent years, and the line you draw back to Agile is notable. (I have much to say on Agile in general, both good and bad). Lots of “old” methods deserve to be scrutinized carefully and even abandoned; unfortunately, as you point out, some people take this a bit too far, simply because a method or approach is old. As my hackneyed cliche in the post puts it, they persist in throwing the baby out with the bathwater.

    And it’s a constant surprise to people that we keep reliving the same old IT problems!

    Unlike you, though, I do believe that there’s something of a larger nature, if not critical mass, in the psychology of many IT people, which often leads them to embrace certain absolutes with a fervor that transcends logic, experience, or even basic scrutiny. Sometimes that pulls them to new fads or movements. Sometimes it consists of clinging to the past (older technologies, for example), just because it is known and comfortable. (I worked with a guy in the ’80s, for example, who declared he saw no reason to ever learn any programming language other than PL/I.)

    Thanks for commenting; I appreciate your perspective!

Trackbacks

  1. […] IT does the moonwalk: our endless search for absolutes — by Peter Kretzman | CTO/CIO Perspectives […]
