Thoughts On ...

Photo credit: Pat Sarnacke

April 30, 2011

Software is not an asset

Software is not an asset - I know everyone thinks it is, but it's not; it's a liability - its very existence generates cost. Every minute someone spends coping with it in order to adjust the behavior of hardware (which is an asset) is a net debit on the books and a decrease in profitability. A necessary expense in many cases, but a cost nonetheless. Getting rid of it goes directly to the bottom line.

Software isn't a thing - it's a form of communication. Just as no one would list meeting minutes or recordings of the CEO's speeches as company assets, software should be seen as stored communication (with hardware). Its value should be viewed in how it changes behavior; its very existence should be seen as creating risk, and its deletion as key to a healthy techno-ecosystem.

This is the essential argument behind why refactoring is worthwhile, and why its value isn't really comparable to writing code. It's also why it's hard for non-programmers to see that value - because they think of programmer time as a credit, when really it can also be a debit.

My advice to managers (and everyone else): Want a sure-fire way to add value to your organization? Reward and encourage the deletion of unused code; fight for and insist that programmers be given time to refactor and otherwise eliminate unused code, and don't account for time spent doing so in the same way you do the creation of code.

My advice to programmers: Delete unused code at *every* opportunity and scale (function, class, file, project, organization); don't compare the effort to do so to writing code; work at all times to have your software feature-set reflect its actual, operational use.

Posted by wcaputo at 1:40 PM | Comments (1)

April 17, 2011

Oil 101

As I watched the Deepwater Horizon oil spill unfold last summer, I found there were many fundamental questions about the oil industry that I didn't understand. So, I started reading. I read many websites. Of these, The Oil Drum was probably the most useful, but I was still struggling with some basic questions, like "How does a country benefit economically from letting an independent company take oil from its shores?" - and the sheer volume of contradictory opinion, punditry and proposed solutions led me to conclude: "I need to dig deeper."

So, I bought a book, Oil 101, as it got excellent reviews and seemed to cover everything about oil from chemistry & history to exploration, production & trading. I recently finished this book, and while it's not a quick read, it is very well-written, interestingly presented and comprehensive. Written from a neutral point of view - it describes what is, not what ought to be - it gave me just what I wanted: a good grounding that I can now use for further education, and for evaluation of where we're headed.

Unfortunately, the book has created a new problem for me: nearly everything I now see being presented on this topic (again, The Oil Drum is a notable exception) seems simplistic and uninformed, aimed at supporting one political agenda or another and/or just plain out to lunch.

To take just one example: I stumbled across this article about how the Koch brothers are manipulating the oil market.

Now, I'm no fan of the Koch brothers - particularly their support for anti-union movements and the big sleight-of-hand trick that is the tea party - but this article presents 'facts' and jargon not to educate, but to scare.

Take 'contango.' Investopedia defines it as: "When the futures price is above the expected future spot price. Consequently, the [futures] price will decline to the spot price before the delivery date." Got that? Simple, right? Riiight...
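(If a toy illustration of just the definition helps - and this is purely my own sketch, with made-up prices, nothing from the article or the book - here it is in a few lines of Python:)

    # Toy sketch of contango (invented numbers, not real market data):
    # the futures price trades above the expected future spot price, and
    # converges down toward spot as the delivery date approaches.
    expected_spot_at_delivery = 85.00  # $/barrel: what the market expects spot to be
    futures_price_today = 92.00       # $/barrel: above expected spot, i.e. contango

    days_out_at_start = 90
    for days_out in (90, 60, 30, 0):
        # linear convergence is a gross simplification; real curves are messier
        premium = (futures_price_today - expected_spot_at_delivery) * days_out / days_out_at_start
        print(f"{days_out:3d} days to delivery: futures = ${expected_spot_at_delivery + premium:.2f}")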

To make any sense of contango, you need to understand the oil futures market (not to mention futures themselves), the spot market, global supply and demand (both now and at the time of the contract), how oil is delivered, etc., etc. - all stuff covered in Oil 101, but curiously absent from this and every other article I've read that mentions the term. Probably because it's taken me months to build even a basic understanding of the topic, so an article can't even come close - even if the article writer understands it. Instead it's summarized neatly as: "demand is expected to outstrip supply." Wow, so simple a child could understand it. Clearly those who use the term 'contango' just want to hide how simple it is in order to steal from the poor.

Oh, and the gist of the manipulation? The Kochs are artificially reducing supply by leasing four supertankers and storing oil in them (presumably because they are using contango against us). Just how much supply have they deprived the world of by doing this? Let's do some math (all figures from Oil 101, which is about 2 years old now, but it'll get the point across):

The largest supertankers (unlikely that's what's being used here, but we'll go with it) are called ULCCs (Ultra Large Crude Carriers), and they can hold up to 560,000 DWT (that's deadweight tonnage). Again, unlikely to be filled to the brim, but we'll use the number as is.

4 ULCCs x 560,000 DWT ≈ 2.24 million DWT
2.24 million DWT x 7.5 barrels per DWT ≈ 16.8 million barrels

Wow, 16.8 million barrels of oil... that's how many days' supply?

Not even one. Not even one for the US! In 2008, US consumption was 20.9mm barrels per day (bpd). How much did the world consume in a day? 85.7mm bpd.

16.8 million barrels isn't even enough to feed one day of 2008's US refinery capacity (17.5mm bpd). In other words, the heinous manipulation being described in the article - at worst - is shifting supply by a day.
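(If you want to check my arithmetic - or play with the assumptions - here's the whole back-of-the-envelope calculation as a few lines of Python; the figures are the same Oil 101 numbers used above:)

    # Back-of-the-envelope math from above (all figures from Oil 101)
    ulccs = 4                       # supertankers leased
    dwt_per_ulcc = 560_000          # deadweight tonnage per ULCC (upper bound)
    barrels_per_dwt = 7.5           # rough crude-oil conversion

    barrels_stored = ulccs * dwt_per_ulcc * barrels_per_dwt
    us_demand_bpd = 20_900_000      # 2008 US consumption, barrels per day
    world_demand_bpd = 85_700_000   # 2008 world consumption, barrels per day

    print(f"{barrels_stored / 1e6:.1f} million barrels stored")             # ~16.8
    print(f"{barrels_stored / us_demand_bpd:.2f} days of US demand")        # ~0.80
    print(f"{barrels_stored / world_demand_bpd:.2f} days of world demand")  # ~0.20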

In truth, it's not even close to that, because ULCCs aren't likely to be used for this, not all of the DWT is actually oil, the oil being stored may not service as much demand, etc., etc.

So why is Koch Industries spending money to lease tankers to store this oil? One guess is that it's a hedge. They run refineries, and certainly buy oil to feed those refineries using futures contracts. By storing this oil, they can hedge those contracts with physical oil (much like one can hedge a stock futures position by holding actual stock). Another possibility is that they are hedging against a short-term supply shortage for input into their refineries (e.g. maybe a couple weeks' worth).
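(For the curious, here's a minimal sketch of the offset mechanism behind hedging - the numbers are invented purely for illustration, and the Kochs' actual position is anyone's guess. The point is just that the physical and futures legs move in opposite directions, so the combined position doesn't care where spot ends up:)

    # Minimal sketch of hedging with physical oil (invented numbers).
    # Long physical oil plus short futures: the two legs offset each other.
    barrels = 1_000_000
    spot_paid = 80.00        # $/barrel paid for the stored oil
    futures_sold_at = 90.00  # $/barrel locked in by selling futures

    for spot_at_delivery in (70.00, 80.00, 100.00):
        physical_pnl = (spot_at_delivery - spot_paid) * barrels
        futures_pnl = (futures_sold_at - spot_at_delivery) * barrels
        # total P&L is the same no matter where spot goes
        print(f"spot ${spot_at_delivery:6.2f}: total = ${physical_pnl + futures_pnl:,.0f}")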

Not so evil sounding once you put some numbers behind it, huh? Actually, it sounds to me just like the book's descriptions of how global supply is stored and moved so we don't see occasional spikes of $15.00-a-gallon gasoline in the summer. In short: the refineries are doing their job.

Now, as I said, the Koch brothers seem to have a lot of blood on their hands - but articles like the one linked make that hard to spot when they simply stir up the rabble with BS. The sheer size of the oil-based economy is incomprehensible; the numbers are enormous. An oil tanker-full sounds like a lot, but it isn't even the proverbial drop in the bucket.

Reading this book has made it impossible for me to see the oil problem in simple terms. Nor are any of the "simple" solutions to it... simple. I'm not even sure 'problem' is the right word anymore - it's like saying 'life' is a problem to be solved. True, I suppose, but not very enlightening. No, oil is deeply entwined in the fabric of our lives, and as its details change, our lives - all of our lives - change with it. No one - or everyone (take your pick) - is to blame for that; it just is. Oil magnates and speculators make such great scapegoats - they're rich, remote, and very likely using their power for their own gain - but hanging them in the square won't fix things. Understanding the facts, so we can make our own decisions about how they are using their power (both right and wrong), will.

We all owe it to ourselves - and each other - to learn as much as we can about how our world works. I've joked that I'm one of only three people who've read this book who aren't in the oil industry (me, the book's editor, and the proofreader). If you care about this topic - and want to make your own educated decisions about what's really going on, without resorting to simple solutions like boogeymen - you should join the three of us in reading it.

Posted by wcaputo at 11:15 AM | Comments (0)

February 26, 2011

Introducing Situational Context

(My talk at SpeakerConf was about something I've dubbed "Situational Context" and why it's important for structuring work among groups trying to accomplish something technical. This material is the first part of that talk.)

Situational Context is the local knowledge within someone's work space.

Now, there is a branch of philosophy devoted to knowledge, called epistemology. It turns out that knowledge is a funny thing. It seems obvious to us that we know something, and yet an exact definition has eluded philosophers for millennia - going back at least as far as Plato.

To summarize the common (problematic) definition: we know something when a) it's true, b) we believe it's true, and c) we are justified in believing it's true.

It's that last bit that causes all the trouble.

As you might expect, much of the debate has turned on what is "justifiable," but in 1963 a philosopher named Edmund Gettier showed that even if we are justified in believing something that is actually true, we still may not really have known it (not only is Gettier's paper pretty compelling, it's short, interesting and pretty easy to follow - what a rarity in a philosophy text!).

Further, there are different types of knowledge: scientific, partial & situational (among others). Scientific knowledge is the most sure, but much of our knowledge about the daily world around us consists of the other two.

So what does this have to do with technical work? Well, if it's not entirely certain what knowledge is, and much of the daily knowledge about what people do is partial & situational, and yet we tend to believe what we know is obvious, then clearly figuring out just what a piece of software or hardware should actually do is not as easy as it might first appear.

Technologists need hard, unambiguous facts to successfully implement systems (my theory as to why so many of us are pedantic), but most of the knowledge available to us is partial and/or situational. Further, many (most?) decisions are made using something called bounded rationality (a topic for another post), meaning that at any given time people are making decisions (and changing what's known) using the partial, situational knowledge available in their... context.

And it is this Situational Context that we need knowledge of to do technical work successfully. The problem is that this knowledge is only really available to those within the context. Whenever we try to move it out, it costs us - at least the time spent in informal conversation, more in actual meetings, and even more if we attempt to write it down. It gets even worse when we introduce managers, PMs, analysts and others in an ever-expanding game of telephone that forces us to go round and round seeking the truth.

The XP movement made a point of emphasizing that having customers and programmers working closely together greatly reduced the cost of writing software. Like other early messages of XP (again, a topic for another post), this has been diluted by the Agile diaspora, and so now it's much more common to see programmers working from story cards that either attempt to render explicit the situational knowledge of the customer or (at best) serve as placeholders for a conversation to transfer that knowledge. As agile techniques infect IT, this (anti?) pattern will likely spread to other forms of technical work.

I think we can do better.

This transfer (however accomplished) crosses what I have termed the Situational Context Boundary, and as with any other context boundary, the cost goes up steeply when you cross it. Story cards, iterations, etc. reduce the cost relative to other techniques, but I've begun to think we're optimizing the wrong part of the process.

What if we kept the work inside the boundary? Then presumably costs can be kept lower. How? Well, for those within the context, terms can remain undefined because everyone just "knows" what they mean, allowing whoever is available to simply do the work. Edge cases can be safely ignored because everyone knows they won't be encountered. Features can be anticipated because we know what people are actually doing with the system. The list goes on and on.

This is more than just Daily Customer, and it plays out at all levels: between end-user & programmer, between programming teams (e.g. shared services) and even within them (between pairs & individuals). It's true within IT departments too - especially since they tend to be organized into silos around parts of the whole (e.g. hardware, networking, sys admin, etc).

If people are mostly working within a context, you can reduce or eliminate lengthy conversations to share that context. The trap is that people think they are reducing cost by separating the work, but I posit that any time we save by doing so is lost (with interest) in the endless questions, discussions, rework, loss/gain of trust, maneuvering and just-plain politicking that results from us knowing a hell of a lot less about one another's situation than we think we do.

In short: Be aware of when you are moving effort across a situational context boundary, because the cost of getting things done is going to increase as people switch from working to talking about work. Look for ways to keep the work within a context while sharing effort among many participants.

In a future article I plan to talk more about techniques for doing so.

Posted by wcaputo at 10:48 AM | Comments (3)

April 22, 2010

Could OSS withstand the attack of the GMO's?

UPDATE (April 23, 2010, 21:45 CDT): Another thing I would like to add: I have been experiencing some fairly serious health issues recently due to poor eating habits and food choices, so food quality and health are on my mind a lot these days. That is, having choices and the ability to know exactly what I am eating is not just of passing interest to me, but critical to my ability to get and stay healthy. So, I've decided to support the "Food Democracy Now!" right-to-know campaign, as I believe the critical threat here is to our freedom of information and freedom of choice (for both farmers and consumers); all other issues surrounding our food supply flow from there.

I just signed a letter to FDA Deputy Commissioner of Food Michael Taylor and USDA Deputy Secretary Kathleen Merrigan asking that they not stifle international and U.S. regulations regarding the labeling of GMO food. It's important that Americans know how their food is grown. We have a right to know what's in our food, and the U.S. government should not be working to stifle these basic rights.

If you share my concerns, please consider signing their petition:

action.fooddemocracynow.org/cms/sign/stop_the_sneak_attack


UPDATE (April 23, 2010, 12:11 CDT): added a link to the documentary I am referring to below (Food, Inc.)


(GMO and OSS - depending on who you are and how you got here, one of these acronyms might not mean much to you, while the other one is very familiar. My goal here is to show that you actually know more about both of them than you realize)

This started as an attempt to understand what I was feeling after watching the documentary Food, Inc. last night - particularly how the issues it raised are similar to software and technology issues (though a lot scarier and much less talked about).

One of these is Genetically Modified Organisms, or GMOs. To show the parallels, I've written the following fairy tale (set in the not-too-distant future), and I'd like you to imagine yourself as our protagonist:


You control a moderately large and successful software security company. One day, you develop a game-changing product, Knock Out Pro ("don't discriminate... eliminate!"), that permanently disables any computer it is used against. You successfully market it to governments and big companies ("wipes out botnets, zombie PCs, illegal content servers and all manner of subversive and undesirable computers on contact!") as well as individuals (for "household use only"), and it gets widely deployed and used world-wide.

Yay. You're rich. Undesirable computers are being wiped out. You're a hero... and the only game in town: you scored a US patent for your product's innovative ability to do what it does, and new cooperative trade agreements have you covered across the globe.

Except - there's this nagging potential for collateral damage when customers actually use Knock Out Pro. (Most of) your customers would prefer to wipe out only bad computers. Problem is, your product is effective precisely because it doesn't discriminate - it eliminates, remember? Your unique, patented genius is precisely that you left all that pesky target evaluation to the customer and focused on removing the target effectively. However, customers are spending a lot of time, effort (and money) to ensure they don't target good computers (at least ones they care about) when they use your product, and so they're not entirely satisfied.

That's big money being wasted (so far). Wouldn't it be great if they gave that money to you too? Wouldn't it be great if you could just target a whole network and only knock out the bad computers? But you've been in this industry for a while. You already *have* products that selectively disable computers, and they work... kinda. Trouble is, it is really hard to spot a bad computer in a general way: the R&D is prohibitively expensive, the bad computers keep evolving, and there are just so damn many different types of undesirable computers out there that it's a much less attractive market (look at how the anti-virus market is going).

Then it hits you: you'll start your own computer company and build computers that are immune to your product! Then your customers (you're not worried about anyone else here) will be protected from... well, you. But where to start? Well, how about an existing computer design - you could buy one, but that'd be expensive. Besides, computers have been around for a long time now, and there are all these open (i.e. publicly available) system designs and indeed implementations (Open Source Software and hardware and whatnot) all over the place. You never really got why anyone would freely share something so valuable (bunch of stupid commies), but hey, their loss, your gain, right? You grab one of those and get to work.

What your R&D team determines is that all it'll take is a relatively small and easy change. I mean, compared to building an entire working modern computer system, all you'll have to do is add in the bit that counters your other product - and unfortunately, there's already a fair bit of research available out there for that (in fact, you're already engaged in lawsuits trying to suppress that information from getting out... thank god for the DMCA).

Wow, in no time, you'll have a whole array of computer products that meet an increasing number of your biggest clients' "good computer" needs. You thought you were rich before? This is going to be Microsoft money, baby.

But wait, now you have a new problem: the change is relatively small and easy, right? The system you're adapting is a publicly available design, and those security "researchers" have already theorized about how to block your knock-out punch. If you go and release a way to do it, you'll be screwed, right?

Luckily, you already employ a large (and growing) crack legal team, and they offer a solution: patent the countermeasure too! Not one for half measures, you tell them that's all well and good, but you'd like it better if they just went ahead and patented the whole computer...

(...crickets...)

..."well," the lawyers reply, "That's gonna be kinda tricky. See there's this thing called the 'GNU Public License' and the countermeasure well, its a modification to this computer operating system called Linux, and the license does not permit you to patent Linux (umm, because you didn't create it). The hardware specs have a similar restriction."

You reply that you don't care about tricky, you care about results. And so they go to work, as do your lobbyists, and lo and behold you get a shiny new patent on your original computer. Granted, there's still this GPL license to deal with, but with the patent, you figure that you might even be able to sue those who wrote it in the first place. And so you build the computer, your customers eat them up like hotcakes, and you step out into grandeur beyond your wildest expectations...

Five years later, and my how things have changed. Sure, the big tech companies are still around, but you're in the catbird seat. Some licensed your patent right away. Some (including the Linux community) fought back, of course, but you've tied that up in the courts for years to come, and now that the founding partner of your favorite law firm was just appointed to the US Supreme Court, the outcome isn't really in doubt. And sure, people don't "have" to buy your computers, but there are increasingly few places where Knock Out isn't being run, what with the passage of the bipartisan bill OCTPPCFPA (Online Child, Terrorism, Porn, Piracy, Copyright Freedom Protection Act), and you have a small army of private investigators working night and day to figure out how those few businesses that aren't licensing your patent are getting away with it.

In short, the general consensus (according to anyone who matters) is that technology is entering a golden age - there are almost no bad computers any more... at least in the US, and even the movie and music industries are starting to think the worst might be over for the whole piracy thing. The Internet-wild-west days are finally coming to an end, and everyone has you to thank for it. Granted, there are some third-world countries where things are still wide open (and worse, your products have been pirated), but fortunately they are largely walled off from the networks used by the civilized world... for now. Granted, the threat from them is real and must be dealt with... but seeing as how you've got the inside track on becoming the new cyber-security czar (after all, you are the expert), the future is looking bright indeed...


Just a fairy tale, right? An obviously transparent and unrealistic protection racket that no one would ever go for?

Well, then reconsider our little fairy tale, but this time, instead of computers, think plants; instead of "anti-botnet software," think herbicides; and instead of patented computers, think patented genes.

That's right: plants that humanity has cultured and shaped for thousands of years (what's more open source than that?) have, in the last 20 years, become proprietary and monocultural. Don't believe me? Watch the movie. It names names. (I don't have that kind of legal budget.) Or go read this to see what's happening right now.

In short, the real harm I see in GMOs is not the health concerns and other bugaboo disinformation that one generally hears in what little public discussion goes on about this stuff. It is the much broader and more fundamental risk: what once was the intellectual property of the human race - the result of 10,000 years of cultivating crops - and all the value created by it, as well as the freedom of our farmers and of those who buy what they produce (i.e. ultimately all of us), has been taken right out from under our noses. And those doing the taking are defending their actions on the grounds that anything else would harm their investments and bottom lines (when they bother to defend it at all; most of the time they just push everyone out of the way).

Frankly, until yesterday I would've thought my fairy tale was the more plausible scenario.

Posted by wcaputo at 1:05 PM | Comments (5)

January 19, 2010

Installing MIT Scheme on 64-bit Ubuntu

A quick note on installing MIT Scheme on 64-bit Ubuntu (which I was doing because I'm diving into SICP again) -- you can't. Well, you can, but you may run into an error because libmhash.so.2 is missing. This happens because there is currently no 64-bit version of MIT Scheme (though one is coming).

If this is your situation, here's what you can do:

  1. Install MIT Scheme. Follow the install instructions here. If you try to run scheme from the command line, you should get an error saying that it can't find libmhash.so.2.
  2. Install getlibs.
  3. Now use getlibs to install the Ubuntu libmhash packages (e.g.):
    sudo getlibs -p libmhash-dev libmhash2
  4. Type scheme at a command line. You should get a prompt.

That's it - Scheme should work now (or at least start; I haven't moved far beyond this yet). If you try this and run into an issue, leave a comment, and I'll update the article and/or see if I can help.

Posted by wcaputo at 2:58 PM | Comments (0)

January 1, 2010

Teaching Magic

"Witchcraft to the ignorant... simple science to the learned." -- The Sorcerer of Rhiannon, by Leigh Brackett
Happy New Year everyone! Before reading further, you should check out this encouraging article on US computer education and at least some of the discouraging comments that follow it. Discouraging because they reinforce the notion that we have a "shop class" mentality in this country when it comes to education (and not just computer education, though that is what I'll focus on).

Comparing the US primary and secondary computer curriculum to "shop class" is an insult to shop classes - not just because computer curriculums aren't any good at teaching a trade (though that's part of it), but also because "learning a trade" should not be the goal of teaching computers in school anyway.

Tellingly, the phrase that drew the most negative comments was the quote from Janice Cuny, a program director at the NSF, stating that the goal is "to teach the magic of computing." That line is what gave me hope that curriculum planners might finally be getting it, because those who really know computing know that this stuff *is* magic - the kind you can actually do.

However, a world with too much magic and not enough knowledge is a risky and dangerous place - ask anyone who weighs the same as a duck.

This isn't specific to computers, of course. As has been said before, magic is simply craft from the perspective of the ignorant. The concept of magic can't exist without ignorance. That's why encountering something magical evokes two otherwise opposite reactions in us: a sense of newness and wonder, and a sense of dread and fear. When you consider what technology is, both reactions seem so natural as to be almost biological.

Technology (craft/science/etc) is leverage - it is fundamentally the knowledge of how to do something that before was difficult or seemingly impossible. It's not the goal that's new; it's the technique or the mechanism. We ate before fire, we moved before the wheel, we killed before we invented swords. But we ate better after harnessing fire, we moved faster with the wheel, and we killed... well, we seem always to have been good at finding ways to kill, but nonetheless, it's not the what - the eating, moving or killing - that's new, but the how.

And doing things more efficiently (essentially what leverage is) confers a survival advantage on those who have it - they can do things better, faster, cheaper, etc. This makes it easy to see how those ignorant of new technologies would find them magical, and both terrifying and wondrous at the same time. "They do the impossible! Run! (But, wait, I want one...)"

There is another sense in which technology is leverage: it can result in a tool (talisman) or technique (incantation) that others can use to gain the benefit without requiring all of the knowledge needed to create it in the first place. This is efficient because many individuals in a population gain the benefit of a small number of individuals' effort - and it's been a key fuel in the rise of various cultures and societies. But there are limits to this second form of leverage: if the talisman (tool) breaks, or the incantation (technique) is used in a different context, the benefit is lost, and the user - not having the actual knowledge - is stuck.

So, as long as there has been technology, there have been those who can create it and those who can only use it. And as this is an inherent benefit of technology, it is not something to be done away with, but something to be balanced. We in the US are woefully out of balance.

Which brings me back to the article. Computer technology is a lever that allows us to accomplish goals that were previously very expensive, difficult or seemingly impossible. To those ignorant of how it works (all of us at some point), it is truly magical, and that either frightens us or fills us with wonder (or both).

The time has come to stop quietly accepting the mistaken - yet popular - notion that education is primarily about getting jobs. The purpose of education is to make us better at using the collective knowledge of the human race in all areas of life. Yes, this includes our jobs, but it also includes raising our children, being informed citizens, managing our households, enjoying our leisure time, and just being good neighbors. Education should make us more effective at life. And as a powerful, new and poorly understood technology, computer education should be central to our learning experience, not relegated to some goofy-ass 'business education' electives that look like rehashed typing classes, or arcane math electives that only nerds would sign up for.

Instead, we should be nurturing that sense of wonder - and lowering the fear. Explaining the basics (it's what BASIC was created for in the first place) so that computers aren't mysteries but tools. To do this, the direct goal of primary computer education cannot be seen as preparation for an IT-related job, but as lowering the average amount of ignorance regarding how computers work, so that in every facet of life where the talismans and incantations of computers are present (most everywhere) there is less mystery and more knowledge of how they work. The result is more appreciation for their capabilities (and their limitations), so that everyone can use the technology to do whatever task they are doing more efficiently.

That some will get the bug and want to create those talismans and incantations themselves is certainly a worthy secondary goal, but (and in this I speak from experience) we don't need to make that an explicit goal. For those who are willing to put forth the effort, the lure of working magic is its own reward - it doesn't need to be imposed on them. That's not to say that teaching computing (esp. programming) as tradecraft is a bad idea, but it shouldn't be the focus of general primary and secondary computer curriculums. *Everyone* needs to learn the fundamentals, right in the core curriculum along with math, science, reading and history.

We remain a culture of ignorance at our own peril. History shows us what happens when knowledge retreats and we become content to simply use the magic: first the few begin to control the many; then the many begin first to fear and then to hate the magic and those who understand it; fewer and fewer know the technology, and so it is lost as the talismans break and the incantations are forgotten. We have a term for this process: it's called entering a Dark Age. Here's hoping that in 2010 we move toward the light.

Posted by wcaputo at 12:28 PM | Comments (4)

December 9, 2009

Why Finding a Good Programmer isn't Enough

It is a habit of mankind to entrust to careless hope what they long for, and to use sovereign reason to thrust aside what they do not desire -- Thucydides

UPDATE: (This article on being action-oriented in startups mentions it's better to hire a lot (recognizing and letting the bad ones go quickly) than to analyze carefully and only choose the good. On the face of it, this seems to run counter to what I'm saying here. However, the reason that hiring those who are talented but motivated primarily by their desire to see their tech vision realized (rather than your business plan) is so dangerous is that it is really hard for non-technical people to recognize such people as bad hires until they've failed to deliver on the plan. They'll seem like they're really doing good. They'll tell you all about the great plan, and how it uses the best solutions that [insert hot tech company here] used to win, and so forth -- you just won't see your plan move forward while they find the best way to integrate the cool scripting language du jour into their system. The solution is in the article: force them to deliver results quickly and fail fast. The truly good ones will welcome the iterative nature of your business plan; the others will fight against such accountability. Boot them.)

A recent tweet led me to this article by Daniel Tenner from a couple of years ago. It outlines some things people (particularly non-technical people) should look for in order to hire really good programmers.

I think the list is dangerously misleading, because the items in question, while possibly* necessary, are definitely not sufficient. Worse, a non-technical person who follows this list - and chooses the wrong person - will hire someone who not only will fail to deliver value to the project or company, but who also has the capacity to do the endeavour massive harm; far more damage than an average, or even poor, programmer could cause.

Why? Because the list may enable one to identify smart technology enthusiasts, but it will not help with finding someone who delivers great results. To get that, you need someone who won't put gratifying their ego ahead of team and business goals - and unfortunately, the list Daniel provides describes not only good programmers who deliver results, but also smart, self-centered, egotistical ones.

To summarize the list: Daniel feels that "good" programmers are ones who are smart, are passionate about technology, love to program and learn on their own, would (and have) done so outside of work, are knowledgeable about many disparate technologies, and are uncomfortable using technology they don't feel to be "right."

See the problem? What's missing: will care if your project succeeds. What's present: does this for his own ends. So what happens when the business plan calls for the "wrong" solution (e.g. because the "right" one will take too long - a common problem in start-ups, particularly when a less-than-optimal solution was proposed early on and will take too long to change)? Without the humility and emotional maturity to compromise his perfect vision, the good programmer that Daniel describes might well decide that the business plan is less important than his design vision - and wreck a project that was only somewhat sub-optimal.

To be fair, Daniel is replying to this article by Paul Graham and is attempting to distill his experience with finding good programmers. Paul has his own agenda (as Giles Bowkett points out** here), but both are trying to answer the question: how does a business person find good programmers? In this case, I think Paul's closer to the answer (he feels they can't). Daniel's list seems obvious to him (and to me) - but then, we are both programmers. Daniel's answer to this is that he has been on the business side too.

So have I - I've managed technical and non-technical projects and people (both well and poorly), and in doing so I learned an important lesson: non-technical people not only don't understand the difference between, say, Java and C#, they don't even understand whether they are comparable - and more importantly, they don't care! What they want is someone to implement their business plan. They often want the best (or think they do, which is the same thing for our purposes now), but they assume that best means "best at implementing the plan" - Daniel's list will give them "the best at using the technology."

That disconnect carries a huge risk in it; one bigger than hiring mediocre programmers. Because they are smart and so good at using technology, these guys can and will greatly influence the course of the effort. If they are more interested in feeding their technology passion than implementing your business plan, they are in a position to drive your effort off a cliff like no mediocre, passionless programmer ever could.

So the programmers business people really want are the ones who are really good at shipping technological solutions that implement business plans. How do you find those guys?

In my opinion, there are only two ways to find such talent: know them already, or get lucky. And there is only one reliable way to know them: you or someone you trust needs to have seen them succeed (in business terms), and they need to do it a lot - I'd say at least 90% of the time. You know one of these guys? Hire them - and their network of colleagues - now. Complement them with some junior programmers who know how to learn and want to (i.e. Daniel's list is much less risky for junior programmers, but you're better off finding those who have a track record of shipping and letting them hire the junior programmers).

If you don't know one, you're going to have to roll the dice. But you can still reduce your risk somewhat. And yes, Daniel's list will help you narrow the field. But you still need a sense of whether they'll care about your business plan. To do that, add the following to your interview of Daniel's ideal candidate - it still won't be a sure thing, but it should help: ask them how their projects went. Did the project ship? How late? How much money did it make? If they were part of a start-up, ask them how it did. If it failed, why did it fail? Then, listen carefully to their answers.

If they don't seem interested in such questions, it's a red flag. If they only talk about technology, it's a red flag. Either way, they are likely to say they built (or intended to build) the ultimate solution. The ones you want to remove from consideration will tell you how all the 'others' got in the way: how management blew the opportunity, how the stupid (they might not use that word, but don't be surprised if they do) programmers who got there before them made all the wrong choices. If they shipped, it will have been only because they were able to silence the nay-sayers and take control. You may even hear how they finally outmaneuvered the others and got hold of the reins - but ultimately the stupidity of others will be why they couldn't pull it out, and so the product/project/start-up didn't make it.

The ones you want will mostly have found a way to ship anyway. They'll be focused less on how the others got in their way, and more on how they stayed focused on the business goal and made it a success. They'll tell you how those things are just a part of shipping software, and how the team/company/business (not just them!) found a way to win anyway.

In short: Daniel's list will point you at the technologically talented, but be sure to establish that they also care about your business' success; the risk if they don't is much too high.


----
*I say possibly because I am increasingly of the opinion that when programmers say "good programmer" it means "someone like me." For the record, I agree with Daniel's list - but then, I started out on the Commodore 64, will talk endlessly about coding, make a habit of learning about anything and everything, and so on - i.e. that Daniel and I have a similar list may well be because we have a similar background, and not because we know anything about being a good programmer.

**I find this post by Giles very uncomfortably true. Which means you should be looking for my personal agenda in this blog entry, and not for any truth. I'd like to dub this Giles' Rake - because it's easier to see the point in Giles' blog entry than to read the 700 or so pages of Stephenson's Anathem to get the reference to Diax's Rake.

Posted by wcaputo at 10:26 AM | Comments (0)

November 21, 2009

Interfaces with small i's

"`When I use a word,' Humpty Dumpty said, in rather a scornful tone, `it means just what I choose it to mean -- neither more nor less.'" - Lewis Carroll
"Humpty Dumpty was a big fat egg" - Beastie Boys

In reading the interesting discussion between Julian Birch and Ryan Svihla here,
