On my LinkedIn profile, I list one of my skills as “thought-driven development”. This is a little tongue-in-cheek; software engineering over the last few years has developed a lot of “XDDs,” such as test-driven development, behavior-driven development, model-driven development, etc. etc.
“Thought-driven development” doesn’t actually exist, but by it, I simply mean: perhaps we should think about what we’re doing, rather than reaching for a nearby methodology du jour.
In my last job, a colleague of mine used to also joke about “design-driven design” — perhaps the ultimate play on the XDDs since it is also a strange loop.
All this is not to say the XDDs aren’t useful — they definitely are. A lot of them have spawned entire groups of cross-platform open source projects. I am all for anything that makes the adoption of XYZ best practice easier for my team. But these techniques often require some lateral thinking to get to any real benefit.
When evaluating technologies like this, you have to take each little community with a grain of salt. Almost every programming framework / methodology / etc. that exists purports to offer some order-of-magnitude increase in software reliability / developer productivity / whatever else. And almost all, if not all, fail to do so in practice.
Here is one anecdote to illustrate the point. From 2006 to 2008 at Morgan Stanley, the entire corporation was obsessed with Java’s Spring framework and its core “architectural pattern”, Inversion of Control. I can’t even begin to explain the number of man-hours that were wasted re-architecting existing, working software to meet this chimerical conception of component decoupling. I even contributed to this, urged on by the zealots and their blind faithful. All of the reasons seemed great: decoupling code, using more interfaces, allowing for easier unit testing, being able to “rewire dependencies” and use fancy technologies like “aspect-oriented programming”.
Even Google got swept up in the madness and developed their own, competing framework called Guice. And in the end — after all that work — my diagnosis is that IoC is basically a non-starter, a complete waste of time.
It is a complicated framework that morphed into a programming methodology, developed exclusively to work around some annoying limitations of the Java language. Since it was applied without thinking, now everyone’s Java code has to suffer, and you can hardly pick up a Java web application today without being crushed by the weight of its IoC container’s XML configuration files. (Never mind that most other communities, such as Python’s and Ruby’s, have hardly a clue what IoC is all about — a good enough indication that it is a waste of time.)
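To be concrete about what the container is actually automating: the “inversion” boils down to plain constructor injection, which ordinary Java supports with no framework at all. Here is a minimal sketch — the class and method names (`MessageSender`, `NotificationService`, etc.) are illustrative inventions, not from Spring or any real API:

```java
// Plain constructor-based dependency injection, no container required.
// An IoC container like Spring would perform this same wiring,
// but driven by XML configuration instead of a line of code.

interface MessageSender {
    String send(String message);
}

class EmailSender implements MessageSender {
    public String send(String message) {
        return "email: " + message;
    }
}

class NotificationService {
    private final MessageSender sender;

    // The dependency arrives through the constructor; this class never
    // news up an EmailSender itself, so it stays decoupled and testable.
    NotificationService(MessageSender sender) {
        this.sender = sender;
    }

    String notifyUser(String message) {
        return sender.send(message);
    }
}

public class Demo {
    public static void main(String[] args) {
        // Manual wiring at the composition root -- no container, no XML.
        NotificationService svc = new NotificationService(new EmailSender());
        System.out.println(svc.notifyUser("hello"));
    }
}
```

The testability benefit the zealots touted is real enough — you can hand `NotificationService` a stub `MessageSender` in a unit test — but as the sketch shows, you get it from the language itself; the container only adds ceremony around it.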
Every framework and approach should be judged on its true merits, that is, the true cost/benefit analysis of applying that particular technology. Will it save us time? Will it simplify — not complicate — our code? Will it make our code more flexible and adaptable? Will it let us serve our users and customers better?
I regularly go back to old classics like The Mythical Man-Month to remember that nothing we do in software is truly new. I highly recommend you read it, along with its most famous essay, “No Silver Bullet”.
tl;dr stay healthily skeptical, don’t drink the kool-aid.
Hi Andrew,
I thoroughly recommend Googling Rich Hickey’s talk on “Hammock-Driven Development”!!!
Hope all’s well,
John
While everyone is at it, how about listing a skill of developing driven-development!
“Almost every programming framework / methodology / etc. that exists purports to offer some order-of-magnitude increase in software reliability / developer productivity / whatever else. And almost all, if not all, fail to do so in practice.”
Don’t you think that’s a bit overstated, Andrew? Is it really true that orders-of-magnitude increases in productivity are promised? Take Ruby on Rails. It was touted as bringing a significant developer productivity boost over the Java web technology of the time, and many people agreed. But was it even one order of magnitude (10x) more productive? I don’t think so. (I don’t know if this was actually claimed.) Comparative productivity is very hard to measure. It shouldn’t be touted as strongly as it often is.
To your example of the misuse of Spring to retrofit a legacy design, I agree that it must have been a bad judgment call. But this isn’t the fault of Spring. It was a problem of failing to properly evaluate its suitability for that specific task. And a failure to abandon the misbegotten rewrite as soon as it was known to be problematic. It would be an unfortunate consequence if developers forced to use Spring in an inappropriate way came away from the experience swearing to never use Spring or dependency injection ever again.
It’s always difficult to make technology choices for a project. There are three basic strategies. Conservative: Use what worked on the last project. Adoptive: Study and select new open source frameworks. Innovative: Invent something new.
The last is often decried as the ‘Not Invented Here’ syndrome. You are unlikely to invent something that’s better than any existing framework or library. If you’re too conservative, your technology will be out of date. So you’re left with hard decisions about what to adopt and how. People sometimes do rush to jump on a bandwagon without diligent evaluation. But that is a mistake in judgment. It is advisable to conduct a trial adoption of a new framework. If your organization is small, the amount of resources for such R&D is limited. But it can still be done (the ‘spike’ in Scrum, for example). Some risk has to be taken. If it turns out bad, the best advice I’ve heard is: “No matter how far you’ve gone down the wrong road, turn back!” There is indeed no Silver Bullet in software, and this includes having no “best practice” for moving forward with technology. Do not accept top-down dictates on technology choices. Try something new, evaluate, iterate.
Man, well said! Well said! I have lived through the same ordeal a few times in my career. People forget that software development is just a way to solve common problems people have. They are more interested in HOW to do things than WHAT they are doing. If the HOW gets in the way of doing the WHAT, then we need to rethink the strategy…