Wednesday, October 31, 2007

Why What You Don't Know Keeps Growing--It's Geometry!

A geometrically insightful way of explaining why the more you know, the more you realize you don't know:

"As our circle of knowledge expands, so does the circumference of darkness surrounding it."

Monday, October 22, 2007

Bugs. Infestation. Vulnerability. Gauging your Level.

A good way to start a war among software developers is to state that it's impossible to write bug-free software for anything more complicated than a "Hello World" program. Claiming the reverse works just as well. Though even "Hello World" has had its problems at times.

However, there are Bug Infestation Vulnerability Levels (BIVLs) that correspond to software development practices, and these can be a useful shorthand for gauging just how much effort you're willing to put into, and spend on, bug eradication.

(First off, we do need to make some assumptions, like that the hardware works, and that the compiler correctly compiles, or that the virtual machine or interpreter correctly executes the byte codes or other constructs making up the program.)

OK, so let's start with ...

BIVL 0: Invulnerable.

Believe it or not, by leveraging formal mathematical techniques like Z Notation and the B Method, and now that it's practical to perform program verification that formally proves correctness, it is actually possible to write defect-free, non-trivial applications, and, incredibly enough, still be seriously productive in doing it, at least in a corporate, mission-critical environment.

But then when you think about it, why shouldn't you have a decent productivity rate when employing such formal practices? Once the code is written, compiled, and formally proven correct--meaning zero defects--there's no debugging (because there's no bugs, get it?), so there's no tracking down problems, no devising fixes, and no patch integration. Kinda knocks a lot of time off the test phases, eh?

Now whether what you write conforms to your requirements is another matter, but as far as software implementation goes, this is as close to defect-free software development as you're going to get, and it's no longer just for trivial applications.
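To get a feel for what actually gets proven, here's a rough sketch, in plain C rather than Z or B, of a routine with its contract spelled out. A formal method discharges these obligations statically, at proof time; the asserts below merely approximate them at runtime, and the clamp() routine itself is just an invented example, not something from any real verified system.

    #include <assert.h>

    /* Contract:
     *   Precondition:  lo <= hi
     *   Postcondition: lo <= result <= hi, and result == value
     *                  whenever value was already in range.        */
    static int clamp(int value, int lo, int hi)
    {
        assert(lo <= hi);                      /* precondition */

        int result = value;
        if (result < lo) result = lo;
        if (result > hi) result = hi;

        assert(lo <= result && result <= hi);  /* postcondition */
        return result;
    }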

BIVL 1: Best Effort.

Despite the proven savings in time and effort that come with using formal methods and proving program correctness, their use remains limited to a very, very small number of projects, usually just those with extraordinary safety or security requirements.

Frankly, the rest of us programmers just want to do some design and then get coding. For those who are really passionate about writing high quality software, but just can't get into the formal stuff, obsessing about doing everything else possible to keep bugs out of the code results in BIVL 1 software. That means:
  • Using a programming language that helps prevent or flush out errors
  • Employing assertions, design contracts, or the equivalent
  • Code inspections
  • Expecting any invocation of any external service to fail sooner or later
  • Distrusting the validity of everything that comes in from an external interface. Or from any code you didn't personally write. Or from code that you did personally write.
Making a BIVL 1 Best Effort is not defined in terms of the programmer's "best effort", i.e., the best they themselves are inherently capable of. It means actually putting into practice the objective techniques listed above (along with others; see the sketch below), which together constitute the best effort possible to produce high quality software. Doing this conscientiously can even make a Terrible Programmer look pretty good.
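For the sake of illustration, here's a minimal C sketch of a couple of those habits in action. The read_altitude_feet() service, the altitude limits, and the function names are all hypothetical; the point is the shape of the code: expect the external call to fail, and distrust whatever it hands back.

    #include <stdio.h>

    /* Hypothetical external service, assumed to return 0 on success. */
    extern int read_altitude_feet(int *altitude_out);

    /* Fills *altitude_out and returns 0, or returns -1 if the value
     * couldn't be obtained or couldn't be trusted. */
    static int get_validated_altitude(int *altitude_out)
    {
        int altitude = 0;

        if (altitude_out == NULL)     /* distrust the caller, too */
            return -1;

        /* Expect any invocation of an external service to fail sooner or later. */
        if (read_altitude_feet(&altitude) != 0) {
            fprintf(stderr, "altitude read failed\n");
            return -1;
        }

        /* Distrust everything that comes in from an external interface. */
        if (altitude < -1500 || altitude > 60000) {
            fprintf(stderr, "altitude %d out of range\n", altitude);
            return -1;
        }

        *altitude_out = altitude;
        return 0;
    }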


BIVL 2: I've got to get this done tonight.

This, unfortunately, is the norm. You gotta get the code out the door, or over to test, and there just isn't time to add code to validate every external interface (which should all be working correctly anyway), or figure out what kinds of assertions to put in the code, and no one has any time to inspect anything; but hey, you're a good programmer, or at least "extremely adequate". Sure, you check for the stuff you know could fail, like a file not found, or losing a connection, but beyond that, a function actually failing is exceedingly rare.

And besides, all software has bugs.

At this point click over to your favorite software project failure statistics.

The obvious irony is that part of the reason there isn't time to put in all the checking and validation stuff is that time has to be spent fixing bugs that, well, could've been caught sooner and fixed more easily if more error checking and validation (and paranoia) had been incorporated earlier in the development cycle. So valuable time is spent going back and fixing bugs, retesting, reintegrating, redelivering, and reviewing and prioritizing the never-ending stream of bug reports.

Why is it there's never time to do it right, but there's always time to do it over?

BIVL 3: "Hey, at least it didn't crash..."

If you're going to write code this crappy, why even bother? What are you hoping to accomplish?

One guy I worked with spent a week developing some aero data analysis functions, which then ended up on the shelf for a few months. He moved on to another department, and so when it came time to integrate these functions into the full system another developer picked them up. This latter developer decided to hand run some of the functions just as a sanity check, and kept getting incorrect results. When he dug into the code he discovered that the analysis was all wrong, and that the functions never came up with the right results. He ended up taking another week or so to rewrite it, this time verifying that the outputs were actually correct, instead of just non-zero. The original developer? I think he was ladder-climbing into management--probably just as well.
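A sanity check of that sort doesn't have to be elaborate. Here's a hypothetical C example (dynamic_pressure() just stands in for whatever the real aero functions computed): the output is compared against an independently worked value, not merely checked for being non-zero.

    #include <math.h>
    #include <stdio.h>

    /* Stand-in aero routine: dynamic pressure q = 0.5 * rho * v^2. */
    static double dynamic_pressure(double rho, double v)
    {
        return 0.5 * rho * v * v;
    }

    int main(void)
    {
        /* Sea-level density 1.225 kg/m^3 at 100 m/s should give 6125 Pa. */
        double q = dynamic_pressure(1.225, 100.0);
        if (fabs(q - 6125.0) > 0.5) {
            printf("FAIL: expected ~6125 Pa, got %f\n", q);
            return 1;
        }
        printf("PASS\n");
        return 0;
    }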

Maybe your software does happen to work correctly, but only when absolutely nothing unexpected happens, when all services and interfaces work properly and return only valid data, when nothing is at (or beyond) the boundary conditions, when all inputs fall within the realm of "normal" use.

And if something off-kilter does occur? Well, you have no idea how the software is going to react. Maybe it will crash, maybe it will start acting flaky, and maybe there will be absolutely no indication at all that something has been going awry for a long, long time.

What did you think was going to happen?

Summarizing the Bug Infestation Vulnerability Levels:
  • 0 - I have proven that nothing can go wrong (really, no joke).
  • 1 - I know everything that can go wrong.
  • 2 - I don't know what all can go wrong, but I'm pretty sure I know when it's going right.
  • 3 - Is this thing on?
The amount of time spent testing and integrating always exceeds the time spent coding. It's a tired refrain, but true nonetheless: Spending more time in coding, and conscientiously coding defensively, reduces test and integration time, and therefore overall development time.

Driving down your BIVL can actually reduce development time by markedly reducing the rework portion of the development schedule. There are techniques to drive that level down that result in increased code quality. Some are more formal than others, but the main requirement is recognizing that good software is difficult to write, and then being mentally prepared and disciplined and equipped with the right techniques and tools to manage that difficulty.

Good code is doable, and doable efficiently.

What's your Level?

Looks Like a Good Time for Techies to Take a Career Risk

Regardless of boom/bust cycles, technology jobs are, and show every sign of continuing to be, in high demand. Steve Tobak recommends taking advantage of an opportunity:

"You know why I'm confident that risk-taking is good for you? Because, my interpretation of the Labor Department's data is that you've got the biggest safety net of all time under you. Use it."

Samurai Programming

Excellent post with suggestions on getting good code out the door.

"* The Way of the Code Samurai *

Now, I don't actually know much about real samurai, but the basic thing I've heard is they stand and stare at each other for hours, and then suddenly BAM strike once and the other guy is down.

That's how you should code."

So I'm not the only one that paces around my cube, sketches on the whiteboard, walks the halls, and thinks for hours and sometimes even days about how to approach a hard problem, until...it's ready. I then sit down, fire up the editor, and go to it.

There's more in that post, too, that closely aligns with the coding philosophies and techniques I've successfully used.

Sunday, October 21, 2007

A Christian Realist's (Brief) Perspective on General and Special Revelation

Back when I was a kid the minister or Sunday school teacher, I don't remember which, was talking about how God provided us with "General Revelation", which was the material universe, and "Special Revelation", which he memorably described by holding up a Bible and noting that he was holding the sum total of God's Special Revelation.

General Revelation tells us about God because God caused the universe to come into being and it therefore reflects His nature, and Special Revelation was explicitly provided to us by inspiration, dictation, and the observation and recording of historical events.

The two must be in harmony because God is rational and so it would make no sense for these two representations of God's nature to be in conflict with one another. Throughout history, however, there has been a perceived conflict between the Universe as we understand it versus the content of the Bible, a conflict that continues to the present day.

General and Special Revelation, while related, are distinct bodies of knowledge. Each informs the other, and our goal is to increase our understanding of them, coming closer to the fundamental Truths of each.

Our knowledge of what the universe is, from quarks to cosmos, evolves over time as we're able to build on the knowledge that was gathered before, and as our technology gives us access to information about the universe that was previously inaccessible.

Likewise our understanding of the Bible grows and changes over time, as our culture and society become more sophisticated: no more the explicit subordination of women, tolerance of slavery, or genocide. Arguing that the Bible is always taken literally, and not subject to interpretation, is specious: for example, very few conservative Christian churches implement 1 Cor 11:5-6 regarding the covering of women's heads in church.

Our knowledge of the universe is imperfect, and our wisdom in interpreting the Bible is imperfect, but in my opinion we have a good overall handle on both. When the two come into apparent conflict, research, reconsideration, and refinement of our understanding of one or the other (or both) is needed.

The material universe, and our knowledge of it, is objective. It can be observed, measured, and experimented upon. It works consistently and reliably in its framework of physical laws, and it truly has no secrets, only things that have not yet been discovered.

Special Revelation provides an explanation and rationale for our place in the universe; it addresses the existential, ethical, moral, and philosophical issues of what humanity is, and how we should then live.

The Special Revelation of the Bible builds on the General Revelation of the universe, so it cannot contradict it.

When the two appear to conflict, more research and study, in pursuit of a more complete understanding, is needed. The Bible "amplifies" the understanding and explanation of nature; it does not define it. The material universe is as it is, its reality is its own definition.

Astronomy and geology have determined that the universe is 13.7 billion years old, and the earth 4.6 billion, respectively, and these are objective facts. The "young earth" interpretation some make of the Bible's Genesis account of creation that concludes the Earth is 6000 years old flies in the face of the facts, and so is clearly wrong. The "six days" of creation obviously can't be interpreted to mean six literal 24-hour days, because that interpretation conflicts with reality. And that should've been the end of that debate once the ages of the Earth and the universe had gotten pinned down.

Putting primacy on one's interpretation of Biblical passages when discussing objective reality, i.e., the behavior and composition of the universe, is misguided; it's mixing apples and oranges. Too many Christians don't understand this, and think that citing the Bible to support one's belief about any particular subject, spiritual or material, is irrefutable proof that that belief is a fact. (In actuality, the facts of the nature of the material universe are right in front of one, and all that's required is a willingness to study, analyze, and understand. And maybe an NSF grant or two :-)

Since the Bible literally defines the Christian faith, citing the Bible is perfectly valid in that domain, e.g., "What is the Christian conception of 'Heaven'?".

But it's much less so when stepping outside that domain: "How do you know there's a heaven?" "Because the Bible says so." Such an assertion is completely lost on an atheist or anyone who doesn't acknowledge the Bible as an authoritative document.

Christians, especially conservative ones, don't seem to grasp this latter point--because that authority is an intrinsic part of their belief system, discounting it seems inexplicable to them, and so they conclude that the work of the Devil is the ONLY possible explanation.

For a person to become a Christian, they need to understand what the Christian faith is, and accept that it's a valid faith. Now one obviously uses the Bible to describe what it is, to show that it is internally consistent, and to show that it is consistent with human nature, but one can't use the Bible to "prove" the Bible. A person who moves from non-believer to believer may have it come on them like a bolt from the blue--an epiphany--or it may be a long, intellectual struggle that brings them to belief, or more commonly something in between, like someone whose "life changed" due to their acceptance of Christianity.

The better that Christians understand this, and what the limitations of Biblical argument and interpretation are, the more effective they can be in bringing people into God's kingdom.

But the more they try to assert the preeminence of their interpretation of Biblical passages that touch upon the natural world in a way that seems inconsistent with observed reality, the more foolish they come across (and I don't mean foolish in a humble, edifying way, I mean foolish in a "you're wrong" kind of way). And this just makes the task harder, because not only are Christians perceived as being out of touch with reality, but by not realizing it, and worse, being unwilling to alter their beliefs in the face of reality, they give the perception that Christianity is more about blind loyalty to a set of detached-from-reality beliefs than the living, breathing, growing faith that it is.

Tuesday, October 16, 2007

Say Hello to my Leetle Friend



After an Alabama summer of chowin' down on whatever it is praying mantises eat, they're finally ready to tackle the big stuff by early fall--mice, small pets, irritating neighbor children.

This guy's every bit of six inches long, and cuts a sharp profile:



Off to the hunt...

Friday, October 12, 2007

"Dinosaur Programmers" Illustrated

The perfect illustration for an earlier post.

Titan's Lake Country

Here's a nice flyover of all the radar imagery of Titan's lake country, up in the North Polar region. Bring a jacket, it's cold--around -179 Celsius.

Ah, lakes and frigid temps, ya sure reminds me of home. :-)

Monday, October 8, 2007

Design Patterns Indicate Programming Language Weakness?

Mark Dominus at The Universe of Discourse examines the role design patterns are playing in software design today and draws some conclusions about what their use and promotion indicates about the state of programming languages.

My reaction to his conclusions is mixed.

Dominus looks way back to how the "subroutine call" design pattern eventually got subsumed into programming languages, thereby removing the need for programmers to explicitly write the code that saves parameters and the return address, in favor of simply "making a function call".

Similarly, one can do object oriented programming in C by following a widely used "Object-Oriented class" pattern, which has since been directly incorporated into other programming languages via "class definition" semantics.
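The pattern comes in many variants; a minimal C sketch of the idea might look something like this (the names and the particular vtable layout are just illustrative): a struct carries the object's data plus function pointers standing in for methods, and a "constructor" function wires them up, which is exactly the bookkeeping that "class" syntax later absorbed.

    #include <stdio.h>
    #include <stdlib.h>

    typedef struct Shape Shape;
    struct Shape {
        double (*area)(const Shape *self);   /* "virtual method" */
        void   (*destroy)(Shape *self);
    };

    typedef struct {
        Shape  base;       /* "inheritance" by embedding the base struct first */
        double radius;
    } Circle;

    static double circle_area(const Shape *self)
    {
        const Circle *c = (const Circle *)self;
        return 3.14159265358979 * c->radius * c->radius;
    }

    static void circle_destroy(Shape *self)
    {
        free(self);
    }

    /* "Constructor" */
    static Shape *circle_new(double radius)
    {
        Circle *c = malloc(sizeof *c);
        if (c == NULL) return NULL;
        c->base.area    = circle_area;
        c->base.destroy = circle_destroy;
        c->radius       = radius;
        return &c->base;
    }

    int main(void)
    {
        Shape *s = circle_new(2.0);
        if (s == NULL) return 1;
        printf("area = %f\n", s->area(s));   /* dynamic dispatch, done by hand */
        s->destroy(s);
        return 0;
    }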

He concludes that "[design] patterns should be used as signposts to the failures of the programming language. As in all programming, the identification of commonalities should be followed by an abstraction step in which the common parts are merged into a single solution."

There's definitely merit to this argument, as evidenced by his citing the incorporation of subroutine and class "patterns" into programming languages. He observes that even more advanced patterns, such as MVC, are starting to show up in programming systems such as Ruby on Rails.

The aspect that gives me pause, though, is just where one draws the line when considering whether to "incorporate a design pattern" into a programming language.

Nowadays, it seems like a no-brainer to incorporate subroutines, classes, and concurrency as built-in programming language features.

But is MVC an appropriate pattern to build in? How far should you go in building direct support for distributed processing into a programming language?

Left unchecked, using design patterns as a guideline for programming language evolution will result in the accrual of more and more language features, with concomitant increases in complexity, specialization, and learning curve.

Are we considering leaving the era of the "general purpose" programming language behind? And what does it mean to the underpinnings of our software technology if we do?

Saturday, October 6, 2007

The Appendix: Boot ROM of the Digestive System

Some scientists think they may have figured out the purpose of the long thought useless appendix:

"Diseases such as cholera or amoebic dysentery would clear the gut of useful bacteria. The appendix's job is to reboot the digestive system in that case."

Solar Shingles(!)

While we're not looking to reroof the house for several years yet, the idea of using the roof for power generation is going to get a serious look.

It's still pricey, though at least now in the "willing to consider" range; in ten years or so I'd expect it to be quite cost-effective.

Wednesday, October 3, 2007

Dinosaur Programmers Know More Than You

Recently I spent several months terrorizing two software development departments, one responsible for developing the GUIs and the other the command and control functionality of a large, complicated, expensive data management and planning system.

If you're an enthusiastic, brimming-with-confidence, leading edge technology oriented software developer, you really don't want your software being wrung out by middle-aged dinosaur programmers like me. With hundreds of thousands of lines of code under my belt, major systems designed and deployed, obsessive software quality expectations, and complete disdain for programmer egos, I've got almost 25 years experience knowing what programmers forget, screw up, hope won't happen, and ignore. Yeah, because they're the things I forgot, screwed up, hoped wouldn't happen, and ignored.

I was handed the system's test procedures, got two days of hands-on training from one of the operators who was leaving the project, and then I was on my own. Following the steps in the test procedure worked more often than not, departing from them in free-play testing frequently didn't. It didn't take me long to start writing bug reports. Way too many were for stupid stuff, like not checking that the minimum altitude of a range was less than the maximum altitude, or accepting a manually-defined polygon where I'd provided only one or two vertex coordinates (a polygon requires at least 3 non-collinear ones--oh, and it also accepted 3 collinear vertices).
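For the record, the kind of checks whose absence generated those bug reports are not hard to write. The code below is a hypothetical sketch (the Vertex type and the exact-equality comparisons are simplifications, not the actual system's code):

    #include <stddef.h>

    typedef struct { double lat, lon; } Vertex;

    /* A range's minimum altitude must be below its maximum. */
    static int altitude_range_ok(double min_alt, double max_alt)
    {
        return min_alt < max_alt;
    }

    /* A polygon needs at least 3 vertices, and they can't all be collinear. */
    static int polygon_ok(const Vertex *v, size_t count)
    {
        if (v == NULL || count < 3)
            return 0;

        /* Find a vertex distinct from the first... */
        size_t j = 1;
        while (j < count && v[j].lat == v[0].lat && v[j].lon == v[0].lon)
            j++;
        if (j == count)
            return 0;   /* all vertices coincide */

        /* ...then look for any vertex off the line through v[0] and v[j]. */
        for (size_t k = j + 1; k < count; k++) {
            double cross = (v[j].lon - v[0].lon) * (v[k].lat - v[0].lat)
                         - (v[j].lat - v[0].lat) * (v[k].lon - v[0].lon);
            if (cross != 0.0)
                return 1;   /* found three non-collinear vertices */
        }
        return 0;           /* everything lies on one line */
    }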

Using a conscientious programmer as an integration tester is a scary thing when it's your code he's testing (though it's a great thing if you're the test group manager, since the more bugs that are found, the easier it is to justify your job :-). In one instance the lat/long coordinates tracking the pointer on a map display were not being correctly transferred to the main operations windows when the mouse was clicked, they each differed by a few hundredths of a degree. The GUI developers kept claiming it was a mouse-pointer/screen resolution issue, and while such issues certainly do exist, this was definitely not one of them. I know how software works, I've dealt with resolution issues, and this was not a screen pointer/resolution issue, it was simply a matter of data not being properly transferred from one window to another. And eventually one of the developers did grudgingly dig into it and discovered that two different algorithms were being used to convert from screen coordinates to lat/long position, and that's what was causing the discrepancy.
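The cure for that class of bug is boring: one conversion routine, called from every window. As a purely hypothetical illustration (a made-up linear mapping; the real system's projection was surely more involved):

    typedef struct {
        double west, east;     /* longitude bounds of the displayed map */
        double south, north;   /* latitude bounds */
        int    width, height;  /* window size in pixels */
    } MapView;

    /* The single screen-to-lat/long conversion every window should share. */
    static void screen_to_latlon(const MapView *mv, int x, int y,
                                 double *lat, double *lon)
    {
        *lon = mv->west  + (mv->east  - mv->west)  * ((double)x / mv->width);
        *lat = mv->north - (mv->north - mv->south) * ((double)y / mv->height);
    }

If both the map display and the operations window call the same routine, their coordinates can't drift apart by a few hundredths of a degree.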

While I'm well versed in what programmers get wrong, I also know what they (and I as a developer) need to have to fix things:

This system did a lot of logging, so I made sure to record what I was doing, what data I was using, the timestamps of when things went awry, and had all the log files collected together to turn over to the development group.

Having been the victim of inept testing I also know the number one thing not to do: If a bug has ripple effects, don't document the ripple effects as separate problems. This drove me insane when testers did it to me; when something breaks, fine, write it up, but don't keep trying to follow your test procedure exercising what you just discovered to be broken functionality! Each problem that gets written up has to be dispositioned, which takes time and effort, and there's no value in documenting and dispositioning each of the myriad and marvelous ways in which failure manifests itself. The tester is wasting time documenting things that will all go away once the root problem is fixed, and while bogus problems are being written up, the search for more actual problems isn't happening.

A few weeks after I came onto this project the developers all pretty much stopped arguing with me.

Eventually I moved on from that project and got tasked to "web enable" a large simulation system (ah, back to development!). The first thing I started thinking about was how to go about embedding a web server within this system and how it would communicate with its clients, and then right on the heels of that I began worrying about everything that could go wrong and how I was going to handle those situations. What if the server goes down in the middle of a transaction? Or the client? Or the network drops out? How will shutdown take place in a half-hosed environment, for both client and server? How about recovery? Restart?

Dinosaur programmers spend far more of their time dealing with cranky software than they do with software that just works. Their job usually is, in fact, converting bad software into stuff that just works. And it's not AJAX, or dynamic typing, or service-oriented architectures, or parallelism, or the latest/greatest programming language that's going to make that happen.

It means taking a cold-blooded, hard look at software development practices and acknowledging the fact that writing brittle, fragile software is easier, and so that is what is done. The programmers who come to terms with this reality gain a fine appreciation for high function software that "just works", and so are unimpressed by whatever hot new technology is on the bleeding edge and is prognosticated as going to "make writing software a breeze". Because it won't.

I once fought a programming language war against two system engineers who wanted us to recode our completed, reliable, correctly functioning command and control system in C++, for no other reason than "that's where the market is going." Less than ten years later anyone citing that as a reason for such a recoding would be laughed out of the office.

Every new technology sounds great, and is capable of great things when it's optimally applied. Which it almost never is. And nothing brings out the imperfections of a new technology like real world usage, in a mission critical environment, with money on the line.

So you'll forgive me when I don't get excited about your new framework/language/architecture/process that's going to change everything/usher in a new paradigm/launch the next Google. As long as people are writing code, and assembling component frameworks, and modeling the business processes, they're going to continue to forget, screw up, hope for the best, and ignore the vague and uncertain parts.

While a new technology might eliminate a whole species of software development errors, you'll just discover, and now have the opportunity to explore, a whole new jungle of failure.

Dinosaur programmers already know this, it's our home turf.

Monday, October 1, 2007

Censoring public condemnation of torture??

Bono, of U2, debt relief, and poverty fighting fame, just recently received the National Constitution Center's Liberty Medal for 2007.

The video of the event, which obviously includes Bono's acceptance speech where he deplores torture and its acceptance by a significant fraction of Americans (38%), has been edited to have his condemnation excised.

WTF?

Updated...

The complete, unedited version of Bono's acceptance speech is now available at the Liberty Medal web site.

Okay, I've seen the explanation, but there was certainly something odd about the original availability of the video.

Chicken, and just Chicken.

No "Duck".

No "Goose".