Monday, December 17, 2007

UI inertia

I just had an irritating experience on a major retail site. It doesn't really matter which one, or exactly what the problem was, but here's a brief summary of the case in point: Fairly early in a several-step checkout process, I entered a new form of payment. Then I realized that I had to correct the shipping on one item. No problem, there was a button for that.

A few steps later, I ended up on a page asking me to enter a form of payment, a page that seemed to have no memory of the new one I'd just entered. Or of anything else I'd ever used, for that matter. Nor did it offer a button to take me anywhere but forward. So I finally ended up starting over, losing most (but not all) of the other information I'd put in.

Whenever something like this happens, I make a mental checklist. Did the system have all the information it needed to let me fix the problem without losing what I'd put in? Would it have had to guess my intentions? Could the problem have been solved much better with known technology? In this case, and so many like it, the answers are clearly yes, no and yes.

Why does this keep happening? Why do we as an industry seem immune to experience? It's not from lack of trying. From what I can make out, having made most or all of the mistakes myself at one point or another, the cycle goes something like this:
  • The application needs some feature. It might be a shopping-cart UI, or a way to remember configuration, or a database, or whatever.
  • Early on, the requirements don't seem that demanding, and it's crucial to Get The Thing Out The Door. So someone puts together a good-enough first cut.
  • Pain results.
  • For most of the perennial problems, this happens again and again, leading people to develop toolkits. Typically each house grows its own.
  • More ambitious and successful houses venture out to bottle and sell theirs.
  • Again there is pressure to Get The Thing Out The Door (the toolkit this time), so the new solution solves just enough problems to constitute a clear improvement. The state of the art advances by a modest increment.
  • Except that all the apps that had to be pushed Out The Door before the toolkit and its improvements came along are already out the door, and thus massively harder to change.
  • As a corollary, products that succeed quickly tend to have clunkier interfaces, since there was less time to change them before they became hard to change.
  • Finally, a certain number of houses won't use the latest stuff anyway. Instead they'll use older stuff or roll their own, for a number of reasons, some valid and some not so valid.
It's not impossible to clean up something that's already out the door, but it requires a special blend of skill and patience. It's hard to make a business case for fixing something that doesn't appear badly broken. Generally it will require an upstart competitor to change the risk/reward balance.

In the particular case of web sites, the path of least resistance has been the "fill out a form, push a button, fill out the next form" routine, with a cookie or two to keep track of how far you've gotten in case you need to break off or backtrack. Even that may be ambitious. There are still surprisingly many sites that will make you re-enter an entire form if you mess up one field, or make you start from scratch if you have to go back to step 1. This is far behind what available tools will support.
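To make the gap concrete, here is a minimal sketch (in Python) of the server-side half of that routine, done the way the available tools have long supported: all the state from every step lives in one place, keyed by the session cookie, and each step merges into what's already saved instead of replacing it. Everything here (SESSIONS, handle_step, the field names) is illustrative, not taken from any real site's code.

    import uuid

    # session_id -> accumulated checkout state. In a real deployment
    # this would live in a database or cache, not a process-local dict.
    SESSIONS = {}

    def get_session(cookie):
        """Return (session_id, state), creating a fresh session if needed."""
        if cookie in SESSIONS:
            return cookie, SESSIONS[cookie]
        session_id = uuid.uuid4().hex
        SESSIONS[session_id] = {}
        return session_id, SESSIONS[session_id]

    def handle_step(cookie, step, form):
        """Merge one step's fields into the saved state; return the cookie.

        Because each step merges rather than replaces, going back to fix
        shipping cannot erase a payment method entered two steps earlier.
        """
        session_id, state = get_session(cookie)
        state.setdefault(step, {}).update(form)
        return session_id

    # Enter payment, then revise shipping twice; the payment survives.
    c = handle_step(None, "payment", {"card": "ends in 1234"})
    c = handle_step(c, "shipping", {"item_7": "ground"})
    c = handle_step(c, "shipping", {"item_7": "overnight"})  # the fix
    assert SESSIONS[c]["payment"] == {"card": "ends in 1234"}

With this shape, "go back and fix the shipping" is just another merge into the same saved state; nothing else is forgotten.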

I'm not entirely against step-by-step processes. Sometimes they work better than alternatives like "tweak stuff until you like what you see and then press 'go'". They at least leave little doubt as to what to do next. Which combination of approaches to use when is a matter of skill, taste and empirical testing with real people.

Whatever the approach, there is always a surprisingly high proportion of stuff out there that just seems like it ought to have been better, that has problems that were identified, and solved, ten or twenty years ago. It's easy to conclude that people just must not know what they're doing, but I don't think that's a big part of the story. Rather, there seem to be fairly strong forces (in particular the door-outward force) tending to allocate resources to ensure that the end product is just good enough, and no better.

One of the best takes on this I've seen is in Richard Gabriel's classic Lisp: Good News, Bad News, How to Win Big. Gabriel is mainly talking about Lisp, and he makes a lot of good and interesting points. In section 2.1, "The Rise of Worse is Better", however, he argues more generally that while we may want to do The Right Thing, a system that doesn't do it has much better survival characteristics. To throw a little more fuel on the fire, Gabriel's canonical "worse is better" system is UNIX. Naturally, it's section 2.1 that everyone quotes.
