Strongly Emergent

What comes from combining humans, computers, and narrative

Matters of Ethics

One of the themes that I’m developing here is that technical skills aren’t everything. I blog at all because I believe that communication skills, especially the ability to craft narrative, are super-important. Today I’m going to write about something else non-technical: ethics. Ethics is an ongoing philosophical brawl, really, so I’m only going to make one request of my fellow IT people: choose one (1) ethical system, make sure it’s internally consistent, and then stick with it.

I urge you to do this because most people - even the ones who seem preachy or nosy - don’t actually care about the specifics of your ethical system. If your behavior - not what you say, your behavior - indicates that you have an ethical system and that your actions are predictable, people will generally be fine with you. Predictability equals safety. This is why, as a cynically pragmatic matter, I advise against deception. For one thing, it’s far too easy for others to mistake a deceptive answer to a technical question for a merely wrong one. This is to be avoided. For another thing, a more important thing, if you give a deceptive answer, you commit to keeping that answer consistent with the world - and that’s an enormous task. You’re either going to have to adjust the world so that it conforms to your answer, or scale up your answer unsustainably. Deception is much, much harder than it seems.

But take heart: you’re not alone in wrestling with ethical issues, and you may take some cheer in noticing that other people are massively screwing up their answers to what should be easy ethical questions. For instance, the system administrator of Harriton High School in Pennsylvania was spying on students via webcams. That’s an easy one to avoid. Don’t do that. Another individual advanced his career with a fictitious alter ego who sold a bad software product and granted his other self exclusive interviews. Avoid that too.

On an organizational level, a treaty is currently being contemplated that would allow the RIAA/MPAA to veto virtually any content on the Internet. I urge you not to take part in that either. Even if you don’t agree with me there, the government is claiming massive surveillance powers through IT, and it behooves you as an IT professional to know about these issues - and to care about them.

As an IT professional, you will have to deal with these issues. When someone wants to use IT to implement a really bad decision, you will be the one called on to do the implementation - and you need to know, ahead of time, what you’re going to do in that situation. That’s why you need an internally consistent, reliable ethical model. If you improvise in a situation like that, the risk of Bad Things happening goes up considerably. For your own safety, you want predictability. Don’t be caught flat-footed.

Seth Godin has a comment that I’d like to repeat - I think it speaks to a common failing we can succumb to when we have to make decisions like this. It applies to far more decisions than the ones I’m talking about, but it’s highly relevant here too.

People are just begging to be told what to do. There are a lot of reasons for this, but I think the biggest one is: “If you tell me what to do, the responsibility for the outcome is yours, not mine. I’m safe.”

When asked, resist.

Go forth and accept responsibility. That’s what IT people do: we’re responsible for IT.