Wednesday, October 3, 2007

Dinosaur Programmers Know More Than You

Recently I spent several months terrorizing two software development departments, one responsible for developing the GUIs and the other the command and control functionality of a large, complicated, expensive data management and planning system.

If you're an enthusiastic, brimming-with-confidence, leading-edge-technology-oriented software developer, you really don't want your software being wrung out by middle-aged dinosaur programmers like me. With hundreds of thousands of lines of code under my belt, major systems designed and deployed, obsessive software quality expectations, and complete disdain for programmer egos, I've got almost 25 years' experience knowing what programmers forget, screw up, hope won't happen, and ignore. Yeah, because they're the things I forgot, screwed up, hoped wouldn't happen, and ignored.

I was handed the system's test procedures, got two days of hands-on training from one of the operators who was leaving the project, and then I was on my own. Following the steps in the test procedure worked more often than not, departing from them in free-play testing frequently didn't. It didn't take me long to start writing bug reports. Way too many were for stupid stuff, like not checking that the minimum altitude of a range was less than the maximum altitude, or accepting a manually-defined polygon where I'd provided only one or two vertex coordinates (a polygon requires at least 3 non-collinear ones--oh, and it also accepted 3 collinear vertices).
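
To make that bug class concrete, here's a minimal sketch (in Python, with hypothetical function names; the actual system was not written in Python) of the input checks that were missing:

    # Hypothetical sketch of the missing checks; names and types are made up.

    def validate_altitude_range(min_alt, max_alt):
        """Reject a range whose floor isn't below its ceiling."""
        if min_alt >= max_alt:
            raise ValueError("minimum altitude must be less than maximum altitude")

    def collinear(a, b, c):
        """True if three (x, y) points lie on (nearly) one line."""
        (ax, ay), (bx, by), (cx, cy) = a, b, c
        cross = (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)
        return abs(cross) < 1e-9

    def validate_polygon(vertices):
        """A polygon needs at least 3 vertices, and they can't all be collinear."""
        if len(vertices) < 3:
            raise ValueError("a polygon requires at least 3 vertices")
        a, b = vertices[0], vertices[1]
        if all(collinear(a, b, v) for v in vertices[2:]):
            raise ValueError("polygon vertices must not all be collinear")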

Using a conscientious programmer as an integration tester is a scary thing when it's your code he's testing (though it's a great thing if you're the test group manager, since the more bugs that are found, the easier it is to justify your job :-). In one instance the lat/long coordinates tracking the pointer on a map display were not being correctly transferred to the main operations windows when the mouse was clicked; they differed by a few hundredths of a degree. The GUI developers kept claiming it was a mouse-pointer/screen resolution issue, and while such issues certainly do exist, this was definitely not one of them. I know how software works, I've dealt with resolution issues, and this was not a screen pointer/resolution issue; it was simply a matter of data not being properly transferred from one window to another. And eventually one of the developers did grudgingly dig into it and discovered that two different algorithms were being used to convert from screen coordinates to lat/long position, and that's what was causing the discrepancy.
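
As an illustration of that bug class, here's a hypothetical sketch of two windows converting the same pixel column to a longitude with slightly different formulas; the display width, map extent, and function names are all made up:

    # Hypothetical illustration only: two windows turning the same pixel
    # column into a longitude with slightly different formulas.

    MAP_WIDTH_PX = 1024                 # assumed display width
    LON_WEST, LON_EAST = -10.0, 10.0    # assumed map extent, degrees

    def map_window_lon(x_px):
        # Treats the pixel as a cell and converts its center.
        return LON_WEST + (x_px + 0.5) * (LON_EAST - LON_WEST) / MAP_WIDTH_PX

    def ops_window_lon(x_px):
        # Treats the pixel as a grid point and divides by (width - 1).
        return LON_WEST + x_px * (LON_EAST - LON_WEST) / (MAP_WIDTH_PX - 1)

    for x in (0, 100, 1023):
        print(x, round(map_window_lon(x), 4), round(ops_window_lon(x), 4))
    # The two answers disagree by up to roughly a hundredth of a degree:
    # easy to blame on "screen resolution", but really just two different
    # conversion algorithms.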

While I'm well versed in what programmers get wrong, I also know what they (and I as a developer) need to have to fix things:

This system did a lot of logging, so I made sure to record what I was doing, what data I was using, the timestamps of when things went awry, and had all the log files collected together to turn over to the development group.
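
A minimal sketch of that habit, with hypothetical paths and file names (the real system's logs were whatever it produced, not these):

    # Hypothetical sketch: bundle the tester's notes and the system's log
    # files into one archive to hand over with the bug report.

    import tarfile
    import time
    from pathlib import Path

    def bundle_bug_report(notes_file, log_dir, out_dir="bug_reports"):
        """Package notes.txt plus every *.log file into a timestamped tarball."""
        stamp = time.strftime("%Y%m%d-%H%M%S")
        Path(out_dir).mkdir(exist_ok=True)
        archive = Path(out_dir) / ("bug-" + stamp + ".tar.gz")
        with tarfile.open(archive, "w:gz") as tar:
            tar.add(notes_file, arcname="notes.txt")
            for log in sorted(Path(log_dir).glob("*.log")):
                tar.add(log, arcname="logs/" + log.name)
        return archive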

Having been the victim of inept testing I also know the number one thing not to do: If a bug has ripple effects, don't document the ripple effects as separate problems. This drove me insane when testers did it to me; when something breaks, fine, write it up, but don't keep following your test procedure to exercise what you just discovered to be broken functionality! Each problem that gets written up has to be dispositioned, which takes time and effort, and there's no value in documenting and dispositioning each of the myriad and marvelous ways in which failure manifests itself. The tester is wasting time documenting things that will all go away once the root problem is fixed, and while bogus problems are being written up, nobody is looking for more actual problems.

A few weeks after I came onto this project the developers all pretty much stopped arguing with me.

Eventually I moved on from that project and got tasked to "web enable" a large simulation system (ah, back to development!). The first thing I started thinking about was how to embed a web server within this system and how it would communicate with its clients, and then right on the heels of that I began worrying about everything that could go wrong and how I was going to handle those situations. What if the server goes down in the middle of a transaction? Or the client? Or the network drops out? How will shutdown take place in a half-hosed environment, for both client and server? How about recovery? Restart?
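
None of those questions are answered by the happy path. As a hedged sketch (hypothetical names, plain Python standard library, not the simulation system's actual interface), this is the defensive shape every client-side call ends up taking:

    # Hedged sketch, standard library only; the real system's protocol and
    # names were different. Every call assumes the server, the network, or
    # the client itself can die mid-transaction.

    import time
    import urllib.error
    import urllib.request

    def post_with_retry(url, payload, attempts=3, timeout=10):
        """POST payload (bytes), retrying on timeouts and dropped connections.

        Only safe if the server treats resubmission of the same payload as
        idempotent; otherwise a retry can apply a transaction twice.
        """
        last_error = None
        for attempt in range(1, attempts + 1):
            try:
                req = urllib.request.Request(url, data=payload, method="POST")
                with urllib.request.urlopen(req, timeout=timeout) as resp:
                    return resp.read()
            except (urllib.error.URLError, TimeoutError) as err:
                last_error = err
                time.sleep(2 ** attempt)    # back off before trying again
        # Out of attempts: surface the failure so shutdown/recovery code can run.
        raise RuntimeError("request failed after " + str(attempts) + " attempts") from last_error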

Dinosaur programmers spend far more of their time dealing with cranky software than they do with software that just works. Their job, in fact, usually is converting bad software into stuff that just works. And it's not AJAX, or dynamic typing, or service-oriented architectures, or parallelism, or the latest/greatest programming language that's going to make that happen.

It means taking a cold-blooded, hard look at software development practices and acknowledging the fact that writing brittle, fragile software is easier, and so that is what is done. The programmers who come to terms with this reality gain a fine appreciation for high function software that "just works", and so are unimpressed by whatever hot new technology is on the bleeding edge and is prognosticated as going to "make writing software a breeze". Because it won't.

I once fought a programming language war against two system engineers who wanted us to recode our completed, reliable, correctly functioning command and control system in C++, for no other reason than "that's where the market is going." Less than ten years later anyone citing that as a reason for such a recoding would be laughed out of the office.

Every new technology sounds great, and is capable of great things when it's optimally applied. Which it almost never is. And nothing brings out the imperfections of a new technology like real world usage, in a mission critical environment, with money on the line.

So you'll forgive me when I don't get excited about your new framework/language/architecture/process that's going to change everything/usher in a new paradigm/launch the next Google. As long as people are writing code, and assembling component frameworks, and modeling the business processes, they're going to continue to forget, screw up, hope for the best, and ignore the vague and uncertain parts.

While a new technology might eliminate a whole species of software development errors, you'll just discover, and now have the opportunity to explore, a whole new jungle of failure.

Dinosaur programmers already know this; it's our home turf.

15 comments:

Jon C said...

Found a slight typo in your article (which was a great read - thanks!). In the following paragraph, notice the sentence in bold:

"It means taking a cold-blooded, hard look at software development practices and acknowledging the fact that writing brittle, fragile software is easier, and so that is what is done. The programmers who comes to terms with this reality gain a fine appreciation for high function software that "just works", and so are unimpressed by whatever hot new technology is on the bleeding edge and is prognosticated as going to "make writing software a breeze". Because it won't."

I think it should read "The programmer who comes to terms..." or "The programmers who come to terms..."

...and yes, I'm pedantic and a perfectionist ;-)

P.S. - feel free to delete / edit this post if / when you fix the sentence.

Marc said...

Good catch. Fixed!

Anonymous said...

Nice read. I never get bored of reading entertaining stuff :-)


But as far as languages go, I do feel that certain practices are better than others (sounds a bit weird heh...). Like using XML is usually worse than using simple human readable formats ;)

Friendless said...

I find your page impossible to read on my CRT unless I select all the text, but it was well worth doing so. As a dinosaur myself I feel I finally understand how to write solid software. Sadly, nobody seems to want that.

Anonymous said...

I'm a young programmer myself (22), but I see the wisdom in this article. I've mostly only dealt with my own code, but have worked up some decent-sized (10-20 kloc) projects and faced the consequences of bad decisions and time pressure.

My approach is mostly from a game development standpoint, and in that field the initial learning curve is always harsh, because bringing together an interesting game always takes substantial software development work. There are endless reams of libraries and frameworks and "game making" programs, but what you get out of them, as a beginner, is a basic example that shows stuff moving on the screen, and then an insurmountable barrier of "what do I do now?" Without any programming background, there's no hope of making the tool do what you want.

Thus, for someone who gets past that barrier, there is almost always a disdain for "wonder solutions" in software.

I still tend to stay on the lookout for Good Ideas. In language design, the ones that seem most useful are automated memory management and implicit static typing. Both are "bookkeeping-reduction" mechanisms that really work in practice. In specific compilers and interpreters, useful error messages and debugging options are the key. And with libraries and frameworks, nothing's a silver bullet. You just hope you can figure out what the hell is going on.

daan said...

Thanks for the post.

I was lucky enough to be thrown into a team of developers where the average age was 45 and I was but 24. The one thing they taught/mentored was that technology changes but functionality does not. Whatever you are tasked with, make sure it does what it is meant to do. How you achieve it, nobody gives a crap. Languages come and go but working systems stay for some reason. In the end clients don't care about AJAX, Django, Rails, Universe, Perl or whatever you pick as your weapon, just make sure you win the battle.

Anonymous said...

You might find this blog/post interesting: http://antipattern.wordpress.com/2007/10/03/42-42/

Anonymous said...

[...]recode our completed, reliable, correctly functioning command and control system in C++, for no other reason than "that's where the market is going." Less than ten years later anyone citing that as a reason for such a recoding would be laughed out of the office.

Until it is put in your contract, so you hire a bunch of college kids with EE degrees to learn how to program by going through your cvs repository and turning perfectly good C into C++.

Seriously. I'm watching it happen. It is like a train wreck in slow motion.

Marc said...

re: "turning perfectly good C into C++"

My condolences. Despite my high-level language preferences, there's nothing wrong with C in the appropriate context.

Anonymous said...

@friendless : Try the 'zap colors' bookmarklet ... straight to black on white legible bliss :)

@marc : very interesting post, hope you don't mind the recommendation above

Anonymous said...

"I once fought a programming language war against two system engineers who wanted us to recode our completed, reliable, correctly functioning command and control system in C++, for no other reason than "that's where the market is going." Less than ten years later anyone citing that as a reason for such a recoding would be laughed out of the office."

The saying "If it ain't broken, don't fix it" definitely holds true. But I would argue that there are instances where you have to keep up with the technology or you are left hanging without support. I have Microsoft in mind when I say this.

E. James said...

This was a very nice and humorous write-up, however your choice to compare C++ to some newer technologies (bunched together and unnamed) is quite wrong.

In C++ you can shoot yourself in your 100 feet a hundred times and have a hell of a time finding which foot was shot and why.

Most newer languages (and I started with C++) make things much easier to code, much easier to read, and much easier to fix.

Although I would never say a rewrite for the sake of hype is a good thing, moving on to new technology and integrating new stuff into old apps is better than drudging along in a language that fewer and fewer people really understand well.

Michael said...

Like using XML is usually worse than using simple human readable formats

Have to disagree with this comment. Dinosaurs do a lot right, as was nicely explained, but one thing they do wrong is cling to inferior data formats. Fixed column width flat files, for instance, are horrible. Delimited formats are slightly better and self-describing formats like XML are better yet.

"simple human readable formats" might be easier for you right now, but are much harder for me in 5 years when I have to figure out what the 912th column is supposed to be.

Marc said...

"Delimited formats are slightly better and self-describing formats like XML are better yet."

Just for the record, I concur on the preference for XML self-describing formats, for the very reason cited.

I don't mean to imply that we dinos reject everything new, just that most everything is viewed with serious and ongoing skepticism, and only those things that make it through the tar pit get accepted and put to work.

Worthless Programmer said...

@E. James: the comparison with C++ was appropriate, because it was probably the latest fad at that time. That's how I read it.

Besides: if you think the problem with C++ is that it is too easy to shoot yourself in the foot, then you just need to learn to aim.

Once you get over the memory issues (by learning how to avoid them and/or using smart pointers and other things; also some discipline wouldn't hurt), then you'll see that all that's left are exactly the same issues. Switching to another language doesn't make you better, as if by magic...