Folks worry a lot about building the wrong thing, that is, making software that fails to please the many different interests of the org, the users, and the operators.
I’ve certainly seen that. We all have. Government seems particularly consistent at doing it, tho surely there are plenty of commercial orgs that have the same problem.
I see this problem differently from most folks. I’d like to take a few tweets and walk through my approach.
The key insight for me is captured in that brilliant subtitle I refer to so often in my work: the founding document of the Extreme Programming movement was called "Extreme Programming Explained: Embrace Change".
My belief is that this "wrong software" risk lies mostly in our failure to embrace change throughout the ecology in which software exists, and that more fully embracing it would reduce this risk to a minor and readily addressable concern.
"Wrong software" is fielding a program that does not please all or most or many of the players (or any?) of the players in that program’s ecology.
Oddly, though, I guarantee you that every single person within the sound of my clattering keys here routinely uses a broad variety of "wrong X" for various purposes throughout every day, and does so with very few qualms or concerns.
Is your car perfect? Your stove? Your shoes? Your mattress? (I know your dog is perfect; I didn’t ask you about your dog.) Your X, for most values of X, is not perfect.
It’s just better.
Better than nothing, for one. Better than what you had before, for two. And, in a rich and very complex way, better in your expectation of tomorrow, for three.
In the software world, we fret endlessly about "wrong software", in ways we rarely do in other domains, where we focus our attention almost entirely on those three forms of better.
Why?
We worry about "wrong software" because we believe that changing software that’s wrong — making it better — can not be done in a smooth stable "continuously better tomorrow" fashion.
"Wrong software" is mostly bad because "changing software" is seen as expensive and anomalous. And now we come to my main point, after which i’ll return to my dumb RPG game…
"Changing software" is expensive and anomalous because we engage in a variety of behaviors and attitudes that make it so. If we didn’t, if changing software was cheap and everyday, we’d file our risk of "wrong software" in the same place we put most of our "wrong X" concerns.
The risk of "wrong software" isn’t the risk of fielding an imperfect X, it’s the risk of making large investments in a thing that can not change to become better, and so must be as nearly perfect at creation as it will ever be.
The modern synthesis seeks to break this model. It seeks the means & manner required to make change ordinary, to embrace it, and in so doing return software to the ordinary domains in which those three modes of better are applied.