Back in the early days of computing, a
software crisis was declared. Projects were being launched with high expectations -- this was back when computers could do absolutely anything -- only to end up late, over budget, disappointingly lacking in features, buggy to the point of uselessness, or not delivered at all.
Many solutions were proposed. Software should be written in such a way that it could be mechanically proved correct. Software engineering should become a proper engineering discipline with licenses required to practice. Methodologies should be developed to control the development process and make it regular and predictable. There were many others.
None of these things has happened on a significant scale. A proof of correctness assumes you understand the problem well enough to state the requirements mathematically, which is not necessarily easier than writing the code itself. For whatever reason, degrees and certificates have not turned out to be particularly important, at least in the places I've worked over the past few decades.
Methodologies have come and gone, and while most working engineers can recognize and understand a process problem when they see it ("Why did I not know that API was about to change?" ... "How did we manage to release that without testing feature X??"), there is a high degree of skepticism about methodologies in general.
This isn't to say that there aren't any software methodologies -- there are hundreds -- or that they're not used in practice. I've personally seen up close a highly touted methodology that used hundreds of man-years and multiple calendar years to replace an old mainframe system with a new, state-of-the-art distributed solution that the customer -- which had changed ownership at least once during the wait -- was clearly unhappy with. And well they should have been. Several months in, the project had been scaled down as it became clear that the original objectives weren't going to be met.
I've also seen "agile" methodologies put in place, with results that were less disastrous but not exactly miraculous either. Personally I'm not at all convinced that a formal methodology is as helpful as a good development culture (you know it when you see it), frequent launches, good modularity and lots of testing.
Several things have happened instead of a cure, or cures, for the software crisis. Languages and tools have improved. Standards, generally
de facto, have emerged. Now that a lot of software is out, both customers and developers have more realistic expectations about what it can and cannot do. Best practices have emerged (Unit tests are your friend. Huge monoliths of code aren't.). Projects get delivered, often late, over budget, lacking features and buggy, but good enough. And it's just code. We can always fix it. I can sense the late
Edsger Dijkstra shaking his head in disapproval as I write this, but nonetheless the code is running and a strong case can be made that the world is better for it.
We don't have, nor did we have, a crisis. What we have is consistent disappointment. We can see what software
could be, and we see what it is, and the gap between the two, particularly in the mistakes we get to make over and over again, is disheartening.
Which leads me back to a persistent complaint: UXen, in general, suck.
Yes, there are plenty of examples of apps and web sites that are easy to use and even beautiful, but there are tons and tons that are annoying, if not downright infuriating, and ugly to boot. For that matter, there are a fair number of pretty-but-useless interfaces. Despite decades of UX experience and extensive research, basic flaws keep coming back again and again. Off the top of my head without trying too hard:
- Forms that make you re-enter everything if you make a mistake with anything (these actually seem to be getting rarer, and a good browser will bail you out by remembering things for you -- and in many cases that's a perfectly fine solution).
- Lists of one item that you have to pick from anyway as though there were an actual choice.
- "Next" buttons that don't go away when you get to the last item (likewise for "Previous")
- Links to useless pages that just link you to where you wanted to go in the first place.
- Security theater that pretends to make things safer. Please make it stop.
- Forms that require you to use a special format for things like phone numbers. Do I include the dashes or not? Another sketch after this list shows the lenient alternative.
- Wacky forms for things like dates that throw everything you know about keys like backspace and tab out the window.
- Error handling that tells you nothing about how to fix the problem.
- Layouts that only line up right on a particular browser.
- Pages that tell you to "upgrade" if you're not running a particular browser.
- General garish design. Text that doesn't contrast with the background, which is too busy anyway. Text that contrasts too much. Cutely unreadable fonts. Animated GIFs that cycle endlessly.
- Things that pop up in front of what you're trying to look at for no good reason.
- Editors that assume, a la Heisenberg, that the mere act of opening an edit window on a document causes unspecified "unsaved changes" that you must then decide whether or not to save (yeah, Blogger, you're guilty here).
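As promised, here's a sketch of the "Next"/"Previous" fix, mostly to show how small it is. The `PagerState` shape is made up for illustration; the point is that button visibility is a pure function of where you are in the list:

```typescript
// Minimal sketch: pager controls that disappear at the boundaries.
// PagerState is a hypothetical shape, not any particular framework's.

interface PagerState {
  index: number; // zero-based position of the current item
  count: number; // total number of items
}

function pagerControls({ index, count }: PagerState): string[] {
  const controls: string[] = [];
  if (index > 0) controls.push("Previous");     // hidden on the first item
  if (index < count - 1) controls.push("Next"); // hidden on the last item
  return controls;
}

// pagerControls({ index: 0, count: 5 }) -> ["Next"]
// pagerControls({ index: 4, count: 5 }) -> ["Previous"]
// pagerControls({ index: 0, count: 1 }) -> [] -- one item, no buttons at all
```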
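And the phone-number one: normalize whatever the user typed instead of rejecting it over punctuation. The ten-digit, US-centric handling here is an assumption purely for illustration:

```typescript
// Minimal sketch, assuming US-style ten-digit numbers: strip the
// punctuation the user may or may not have typed, then validate the
// digits themselves -- the only part that can actually be wrong.

function normalizePhone(raw: string): string | null {
  const digits = raw.replace(/\D/g, ""); // drop dashes, dots, spaces, parens
  const national =
    digits.length === 11 && digits.startsWith("1")
      ? digits.slice(1) // tolerate a leading US country code
      : digits;
  return national.length === 10 ? national : null; // reject only bad digits
}

// All of these come out as "5551234567":
//   "555-123-4567", "(555) 123 4567", "555.123.4567", "+1 555 123 4567"
```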
And so forth -- that list is just off the top of my head. I've ranted about several of these already, though for some reason the industry doesn't seem to have taken heed.
How does this happen?
How does any less-than-satisfactory design ever happen? One answer is that reality sets in. Any real project is a compromise between the desire to produce something great and the need to get
something out in front of the customer. Perfect is the enemy of good enough.
In an ideal world, people would be able to describe exactly what they want and designers could just give it to them. In the real world, people don't always know what they want, or what's reasonably feasible, and designers don't always know how to give it to them. In the ideal world a designer has at hand all possible solutions and is never swayed by the desire to use some clever new technique whether it really applies or not. In the real world designers are humans with limited resources.
This isn't unique to software by any means. Doors have been around for millennia, and people still
don't always know how to design them.
I should pause here to acknowledge that UX is difficult. There are rules and methods, and tons of tools, but putting together a truly excellent UX that's both pleasant and fully functional, that makes easy things easy and hard things possible, takes a lot of thought, effort and back-and-forth with people actually trying to use it.
Again, though, that's not a property of UX. It's a property of good design. The question here is why UX details that seem simple enough -- like avoiding useless buttons and links -- so often go wrong in practice. A few possible answers:
- Actually, UX designers get it more-or-less right most of the time. We just notice the failures because they're really, really annoying.
- It's harder than it looks. It's not always easy to figure out (in terms even a computer can understand) that a link or button is useless, or how to lay something out consistently on widely different screens. (See the sketch after this list.)
- The best tools aren't always available. Maybe there's a really good widget for handling a changing list of items that allows for both quick and fine-grained scrolling and so forth. But it's something your competitor wrote, or it's freely available but not on the platform you're using.
- Dogma. Occasionally guidelines require foolish consistency and UX is not in a position to bend them. This may explain some tomfoolery regarding dates, social security numbers and such.
- Plausible-sounding reasoning that never gets revisited. It may seem like a great idea to make sure you have a valid social security number by requiring the user to put in the dashes as well. That way you know they're paying attention. Well, no.
- Reinvented wheels. The person doing the UX hasn't yet developed the "this must already exist somewhere" Spidey sense, or thinks it would be Really Cool to write yet another text editing widget.
- Software rot. The page starts out really nicely, but changes are jammed in without regard to an overall plan. Inconsistencies develop and later changes are built on top of them.
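To make the "harder than it looks" point concrete, here's the sketch promised above. With cursor-based pagination the client never knows the total count; each response only says whether another page exists, so "hide the useless Next button" becomes "guess until the response arrives." The `fetchPage` stub below is a stand-in for a real API, serving canned data:

```typescript
// Sketch: why "just hide the useless Next button" isn't always trivial.
// fetchPage is a stub standing in for a cursor-paginated API.

interface Page<T> {
  items: T[];
  nextCursor: string | null; // null means this was the last page
}

const data = ["a", "b", "c"];

// Stub "API": serves one item per page from an in-memory array.
async function fetchPage(cursor: string | null): Promise<Page<string>> {
  const start = cursor === null ? 0 : Number(cursor);
  const next = start + 1 < data.length ? String(start + 1) : null;
  return { items: data.slice(start, start + 1), nextCursor: next };
}

async function showPage(cursor: string | null): Promise<void> {
  const page = await fetchPage(cursor);
  console.log("items:", page.items);
  // Only after the response arrives do we know whether "Next" is useful.
  console.log(page.nextCursor !== null ? "show Next" : "hide Next");
}

// showPage(null) -> items: ["a"], show Next
// showPage("2")  -> items: ["c"], hide Next
```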
Hmm ... once again, none of these seems particularly unique to UX. Time to admit it: UX is a branch of software engineering, liable to all the faults of other software engineering endeavors. Yes, there is an element of human interaction, but if you think about it, designing a library for people to code against is also a kind of UX design, just not one with screens and input devices. You could just as well say that the same things that make UX development error-prone make library design error-prone as the other way around.
To answer the original question, there is no UX crisis, no more than there was a software crisis. We just have the same kinds of consistent disappointment.
But who asked? Well, I did, in the title of this post. Interestingly enough, no one actually seems to have declared a UX crisis, or at least the idea doesn't seem to have taken off. Maybe we have learned a bit in the past few decades after all.