Scene: I was CTO at a high-traffic social networking site, circa ten years ago. It was one of those times when our site got crushed by unexpected sudden volume, due to being mentioned in an article in a prominent newspaper. My infrastructure manager walked into my office the next morning, ashen-faced. “We’re gonna get killed tomorrow unless we add ten front-end servers to our prod environment,” he proclaimed. A fairly common IT reaction: absolute, adamant, ominous.
Ten new servers? That was a nice round pulled-from-thin-air number, obviously, and by the time we talked through it, we actually found other, more practical, more feasible ways first to estimate and then handle the increased load. But to the infrastructure guy as he walked in, the situation was both dire and absolute, and he saw only one solution that should even be considered.
So now let’s look at another data point on IT psychology. Take the latest iPhone brouhaha: the quick “cracking” of the iPhone 5s Touch ID fingerprint scanning technology. Amazingly, Touch ID has turned out to be less than perfect. Someone with $1,000 of equipment, plus lots of time, motivation, and patience, could conceivably fool the scanner. Meanwhile, what gets lost in the outrage over this turn of events is the notion that the technology might indeed be “good enough”, or “better than the alternative”. We forget the simple fact that the technology is primarily oriented to people who currently don’t use passcodes at all, and that it vastly improves general security for those sorts of users. As one article pointed out, “The point of any security system isn’t to be unbreakable – there’s no such thing – but to be fit for purpose.”
My larger point: if there’s a problem or a difficulty or even a nuance to a particular approach’s applicability, a common IT practitioner’s instant reaction is that the approach or practice is absolute junk and should be completely avoided.
Similarly, we often reject fundamental improvements to a situation, simply because they are not perfect. We let “best get in the way of better.” On this general theme, an amusing tweet crossed my screen the other day. @rands wrote, “I find when an engineer says, ‘Less than ideal’, they often mean ‘Complete fucking catastrophe.’” I laughed at this, of course, but partly because I’ve more often experienced that scenario in reverse: an engineer deciding, and then loudly and profanely proclaiming, that a situation was nothing short of a complete disaster, simply because it was less than ideal.
We in IT work in a realm where insistent idealism is rampant, oddly. It’s odd, because the need to compromise actually looms everywhere: in our systems requirements, the users we must satisfy, our budgets, our resource capability, our schedules, our legacy hardware and software. All engineering relies on the art of compromise; no bridge was ever built to withstand any possible load. Yet what’s most odd in IT is that various forms of technical, management, and methodological idealism persist, recur, refuse to die, often feeding on sheer inexperience mixed with equal parts hubris and testosterone.
So, throughout my career, I’ve seen IT people tend to gravitate to the ideal, often because we reflexively reject anything that has a perceived downside. Of course, idealism itself has an obvious downside that gets ignored in our new-found enthusiasm: nothing is actually ever completely ideal. Yet, talk to any adherent of a fad methodology or development approach, and you’ll rarely hear the associated downsides or potential flaws acknowledged in any way.
It’s IT doing a moonwalk of sorts, going forward and backward at the same time: objecting strenuously to anything with a downside, while simultaneously inclining towards often-absolute adoption of faddish ideals, and turning a blind eye to any downsides those ideals may have.
IT people typically don’t like situations where we have to deal with probabilities; we tend to prefer dealing in black and white. We want that certainty, so we seek it out, one way or another. If we can’t be certain, then our inclination is “just don’t do it.” If it’s going to involve hard work handling exceptions and edge cases, don’t do it. We look instead for an absolute, “one size fits all” substitute approach. Ten servers, installed overnight, will fix the problem just fine.
And as we seek those absolutes, we often implicitly or explicitly discard reasonable alternatives. Let’s look at some real-life examples where I’ve seen this dangerous “throw the baby out with the bathwater” attitude occur, often accompanied by bait headlines in IT publications, guaranteed to draw clicks, such as “Eliminate your QA department!”
- Email is a drain on time, so let’s stop using email altogether.
- Dealing with operations just makes developers get “too close to the hardware,” plus developers hate having to sit down with the ops guys. So let’s eliminate operations altogether (NoOps).
- Pictures always tell people more than just text, so let’s avoid using text as much as we can. (“If you want to convince someone, draw them a picture,” said Roam. “It works much better than words.”)
- People often misuse the bug tracking tool, so let’s abandon it and just use index cards instead. (With one such team, I found hundreds of index cards randomly scattered in piles around the work area, scribbled with cryptic notes.)
- Best practices? We can’t really tell what those are, and none is truly universal anyway. So let’s ignore best practices.
- “There are no IT projects!” Some projects haven’t had enough business involvement, so let’s mandate that all projects be led by the business.
- QA makes developers lazy about coding, so let’s get rid of QA entirely and make software developers responsible for catching their own bugs.
This is a list of absolutes, a catalog of reactionary resistance, representing the rejection of fairly standard approaches and techniques that have turned out, surprise, to have some downsides or to require some nuanced thinking.
Let me be clear: no one can or should argue against idealism per se; often, much progress is caused by someone who just won’t give up on their ideal picture of how something should work. Yet the above list should help underscore the obvious: IT people often take their idealism too far. When we’re encouraged or tempted to adopt extreme stances because of perceived gaps or downsides in our normal approaches, there’s a high danger we’ll just be moonwalking: we’ll be pulled backwards as we attempt to walk forward. Our judgment is clouded by our search for absolute certainty. So, we all need to focus on cooling down from our instinctive reactions along these lines, and to look for ways to throw out just the bathwater while we hold on firmly to that darned baby.
In short: shooting too hard for the ideal isn’t always ideal. It can be simplistic, it can be an overreaction, and it often backfires. Be wary.