I recently went to a training day where a very earnest presenter tried to tell us that his software could do amazing things. And there is no doubt we were amazed, but perhaps not quite in the way he was hoping.
That the software was glitchy became apparent very early on, as it failed to do a number of the things he was trying to show us. He clearly had a set of carefully scripted demos that he knew would work, but even those produced somewhat unreliable and unpredictable outcomes.
What really had me going for his throat by the end of the day, though, was the number of times one of us would have a problem with the software and he would greet us with “oh, that’s because you’re doing it wrong.”
And no, I wasn’t grumpy because I am a grammar nazi (although I confess I do have tendencies in that direction). It was the overwhelming implication that we should be changing our way of working to suit the software. We should not expect to be able to do things the way we have always done them. We should alter our entire workflow in order to do precisely what the software wants, the way the software wants it – whether that meets our needs or not.
There, in a nutshell, is the sheer arrogance, ignorance and, yes, malpractice of the computer industry, all wrapped up in one neat little package.
It’s not our fault. You’re just doing it wrong.
I have a PhD in Computer Science (in usability, in fact), and I have lost count of the number of technical support people who have tried to talk to me as though I am a three-year-old standing near a wall covered in texta, feigning ignorance. It’s always “What did you do??” in weary, exasperated tones.
Even with my relatively high level of technical understanding, it can take forever to persuade them that it wasn’t some dumb thing that I did, but that there really is a problem with the technology. How does the average user cope with that? Usually by assuming that they did, in fact, do something wrong.
If only I had a dollar for everyone who has ever told me “I’m no good with computers”. I always give the same answer: “No, computers are no good with you!” The best tools do exactly what we need them to do, without interfering with our workflow at all. We don’t need to compensate for them or understand them. They just work. Software all too rarely fits into this category.
For a simple example, picture a door in a public building. It can only be pushed, not pulled, and it has no handle – just a flat plate where the door handle would normally be. The whole “push/pull” quandary is missing with this door. You can only push it. The very design of the door says “push”.
Of course, software is intangible and conceptually complex, so it’s not easy to make it as obvious as a pushable door. But think of the number of doors you come across in your day that can only be pushed, yet have pull-able handles on them. They need labels to tell you what to do – and who stops to read labels when walking through doors? As a simple rule, if it needs instructions to tell you how to use it, then the design isn’t good enough. Good design is distressingly rare.
In the software industry, it is all but absent. In a way I can’t argue with that – as users, it is up to us to do our homework and choose the most effective and usable piece of software we can get. In general we don’t – we buy the most effectively marketed, the cheapest, or the most famous, and for that we mostly have ourselves to blame. Caveat emptor and all that. But what I do find incredibly objectionable is the culture of blaming the user when things go wrong. A little respect would be a fine thing.