Unnecessary Explanations

Introducing users to a new app or set of functionality is a difficult task for which there are no easy answers. One of the oldest tricks in the book is to create a kind of instructional screen in which the interface is explained, either diagrammatically or through the use of elucidating circles, arrows, lines and notational text (what Apple has in the past called “coach marks,” a term I haven’t heard elsewhere but that I really like) directly over the interface. The idea is to add a meta level of guidance to help acquaint the user with the key parts of the interface and how to use them.

I’ve been noticing these more and more lately, a trend that I find regrettable. I’ve designed products with instructional screens and coach marks in the past, and they were miserable failures. In my experience, these types of parenthetical interfaces are almost always misguided, mostly because they run up against one of the (nearly) immutable laws of interface design: people don’t read interfaces.


Look and Read Here and Here and Here

Here’s an egregious example, one of a few that I’ve been casually collecting over the past few months. The welcome screen for Richard Branson’s misbegotten Project magazine app is practically a visual assault on the user’s desire to consume the actual content. Where any rational user would expect to be able to start reading after they’ve launched the app, this screen basically insists that you assemble a piece of Ikea furniture before you can get started.

Not every instructional screen is as woeful as Project’s of course. Here is a much more elegant screen from The New Yorker app.

It features dramatically less text of course, but it’s still far too much for my taste. A very skilled or efficient designer might be able to get the reader to read one or two notes like this, but most readers won’t bother reading any of them. What’s more, this is just one part of the instructional screen for this app; users are expected to swipe down for an additional page of notations and even an icon key. Who’s going to bother?

This might sound like I’m picking on iPad magazines yet again but it really does seem that the biggest offenders in this regard are usually print publications making gimmicky leaps to the iPad. What’s inspiring these instructional screens is an ill-informed grasp of how people really interact with digital media, and that obstinate misapprehension is at the very heart of why I dislike the genre so much.

But ‘pure play’ digital apps are guilty of similar transgressions, too. Here’s one from the RSS reader Pulse. Unfortunately, this is as good as instructional screens usually get, which is to say it’s still not very good. There’s too much text here but there’s some attempt at keeping it minimal, and the descriptions are fairly straightforward without a lot of fussiness in the language.

Stepping away from the iPad, here’s an example I came across from the scheduling app Tungle. This product lives in the browser, but applications across all platforms are starting to be governed by the same expectations we bring to iOS apps. The rough lines have a nice humanistic quality, and there’s an earnest attempt at minimizing clutter on the interface, but there’s just too much text here and too many concepts to absorb.

Back to iOS: here’s one from the BlipSnips app, which crams lots of coach marks into a very small space.

The marks and the dark gray overlay obscure the app itself so thoroughly that very little of the actual interface peeks through, which would seem to run counter to the main goal altogether: what a user wants is to attach meaning to the interface, which is hard to do when you can barely see it.

Don’t Say It

The issue of using too much text aside, the fundamental problem that underlies all of these examples is that they just shouldn’t exist in the first place — especially on multitouch devices, which are fully predicated on the idea of intuitiveness. The most important idea at the heart of every iPhone, iPad and, to a lesser but still significant extent, every Android device is that they need no explanation. You pick them up and use them. No training course, no certification, and certainly no manual. The apps that run on them should be the same way: just launch them and start using them.

In spite of their best intentions, instructional screens are diametrically opposed to this core idea. Of course, they’re attractive because creating interfaces in the iOS mold — minimal, unambiguous, self-evident — is very difficult work, and the idea of just being able to explain away the ambiguities is a seductive one. But all they really serve to do is call attention to the fact that the interface itself was poorly designed. If it needs to be explained, then it’s probably broken.

Side note: if you’ve come across similar instructional screens, please point me to them or send them to me via email. I’m building up a little collection.

Comments
  1. It should be about “Show, don’t tell.” The best user interfaces are the ones that are straightforward, common or familiar and, when they’re not, give users a sense of discovery and, where possible, delight.

    I remember playing arcade games like Street Fighter II — you weren’t told what the moves were or even what the buttons were; pressing them let you know what each one did and, of course, watching others and hearing about the “moves” keyed you in on the killer things you could do with each character.

    I think the key is discovery and rewarding users with that part of it.

    Instructions are useful of course — though I think if you have to have them, you should tuck them away somewhere for “in case sh#t happens.”

  2. Thank you for this roundup. National Geographic on Zinio has one as well, not that the tool is actually complicated – and I don’t think I’ve ever actually looked at it. I seem to recall that the Popular Science iPad app had one as well – but I only looked at the app once because the whole thing was too much.

  3. Hi Khoi. Google Docs has just introduced something similar to guide its users round its reconfigured ‘Home’ screen. You can get at it from the following link: link

    Google gets it a little better by introducing each feature one at a time, avoiding overwhelming you, though I’d still argue it’s way too verbose, and you feel, as in many of the cases above, that it’s explaining ‘Here’s what we did’ rather than ‘And this is better for you because…’.

  4. Andrew: good catch, I hadn’t seen that. One more reason why the screens I’ve rounded up here don’t work is that they present everything to the user at once, where a more procedural approach, like the Google Docs example, makes the concept a bit more palatable. I actually have an even better example of this that I’ll blog about tomorrow.
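    The procedural approach mentioned here — doling out one tip at a time rather than plastering them all over the screen — amounts to a tiny state machine. Here’s a hypothetical sketch (not the API of any app discussed above) of what that sequencing logic might look like:

    ```python
    class CoachMarks:
        """Show onboarding tips one at a time instead of all at once.

        A hypothetical sketch of the 'procedural' approach: the user sees
        a single tip, dismisses it, and only then sees the next one.
        """

        def __init__(self, tips):
            self.tips = list(tips)
            self.index = 0

        def current(self):
            """The one tip on screen right now, or None when onboarding is done."""
            return self.tips[self.index] if self.index < len(self.tips) else None

        def dismiss(self):
            """The user taps through the current tip; advance to the next one."""
            if self.index < len(self.tips):
                self.index += 1
            return self.current()


    marks = CoachMarks([
        "Swipe left to turn the page.",
        "Tap the cover to return home.",
    ])
    print(marks.current())   # only the first tip is visible
    marks.dismiss()
    print(marks.current())   # then the second
    marks.dismiss()          # now onboarding is over; the interface speaks for itself
    ```

    The point of the structure is what it forbids: there is no way to render more than one tip at once, so the “mad-scientist’s-blackboard” screen can’t happen by accident.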

  5. You might also want to take a look at the Rhapsody app for iPhone. In that case the coach marks seem like an admission of failure with the interface of the app itself.

    The ‘Welcome to Penultimate’ notebook in Penultimate for the iPad is probably the most successful that I’ve seen.

  6. I agree that most of the time this is a bad idea, and the examples you gave are pretty hard to look at. Still, I find the Comcast iPad app’s help/orientation overlay (screenshot here) to be pretty nice. It stays on brand, has a little novelty to the style of the callouts. It also doesn’t explain everything, but just the things that they think you ought to try.

  7. Great writeup, and I couldn’t agree more.

    Twitter and (as mentioned above) Flickr have implemented these “unnecessary explanation” pages in clever ways, but I have found them to be more annoying than useful.

    The way I see it, as an online producer, is like this: if a person is passionate about using your application or website, they’ll figure it out. People want to dive in to the content, not fiddle with controls or read the instruction manual (remember when clocks on VCRs used to stay at 12:00?).

    Let people dive into your application, give them an option to view instructions if they’d like, but don’t force it on them.

  8. Fully agreed. This, and the “Help Cursor” approach of Microsoft Windows, are heartbreakingly inefficient attempts to help the user.

    It is often noted that video games in the 1980s and ’90s let the player discover things on their own, either through trial and error or through forcing the next move so as to reveal the depth of controls and interactions. Today, even Nintendo loads many of their games with endless explanatory screens and balloons.

    Granted, two things probably contribute to that:

    1. The audience is less captive – 20 years ago, that one game cost a pretty penny and so once you got it, you damn well figured out how to play it. Today, rather than exploring and learning the game on your own, you can just move on to the next thing.

    2. Today’s games are more complicated. This makes the task of educating the user more challenging – which is, of course, not a permission slip to just give up and go IKEA with it, as you said.

    Your post inspired me to blog a brief thing about how we handled this in our game The Incident. We used simple, graphical in-game hints, and a dedicated Help screen (accessible from the main menu). You can see them here:

    http://j.mp/ehVwNu

    With the Help screen, I thought it was important not to put it in front of the user immediately – it’s optional, and the user has to seek it out. I’d hate to ruin that first-launch experience where the player gets to run around and tap away on their own. We also tried to make it fun by making it into its own little story, distinct in style and tone. Reading this should feel like part of the game, an adventure – not a non-diegetic insert.

    With the in-game hints, we made the basic ones part of the game – you can’t proceed until you do what the hint tells you.

    It’s a huge mistake to show the user ALL the options at once; the best way to learn is to SEE, then DO, so different controls and options have to be shown progressively, letting the user try them out immediately after learning them. Otherwise, it’ll be a miracle if the user remembers one or two of the dozen controls they’re shown on one of these mad-scientist’s-blackboard screens.

    Our game is a simple case, of course, but many apps (and other media) complicate their own simple models needlessly.

  9. This is a great article. Well said.

    Instead of a “how to use this app” model, UX and UI designers can take a cue from video game design and follow a “learn by doing” model, where more functionality is revealed in a natural progression.

  10. Hi Khoi—just stepping back and considering your last two paragraphs and the philosophy that good multitouch applications should never require any training:

    Don’t we think that eventually, there will be complex and powerful multitouch applications that inherently require some level of training? Classifying such applications as inherently poorly designed or broken seems inaccurate in many cases. Additionally, if designers & developers restrict themselves to only easy-to-learn designs, we are further limiting our ability to provide powerful, complex functionality.

    That said, it is always a worthy goal to make an application as simple as possible, and then as usable as possible out-of-the-box.

  11. Cat: If multitouch apps get much more complex then they’ll have given up their true usefulness, I think. They’re an important innovation because they’re dramatically simpler than what we have on the desktop.

  12. Another example of in-app training that one frequently sees on the iPad is the editable getting-started doc. Each of the iWork apps comes pre-loaded with an actual document the user can interact with to learn about the features and UI. Much the same way games have tutorials embedded into beginning levels (e.g., Nova 2 or Orbital), this live tutorial doesn’t require the user to learn a fact sheet beforehand. Penultimate is another app that does this well.

    These apps are for content creation, so they have a blank canvas they can pre-populate. However, SketchBook Pro has help documentation much closer to the magazine apps’: something the user is forced to skim through on first launch, and much more like traditional help docs from the desktop.

    Like Cat, I’m not sure I agree that all apps should be designed in such a way that no help is required. Sure, it’s a laudable goal, but there is a big difference between complex tools like SketchBook Pro versus a media consumption app. I’m not sure where Pulse fits into the mix. It’s in between a magazine and a tool. It’s basically a tool for consuming RSS feeds that then turns into entertainment.

    I do agree that the majority of the magazine apps complicate the design. But at the end of the day it’s about context. The great thing about the iPad is that the device turns into the app you’re using. Sometimes that’s a powerful tool that requires instructions and other times it’s an entertainment device that allows users to sit back and enjoy. And much of the joy will be lost if you have to learn how to sit back.

  13. MK: I think Sketchbook Pro is an interesting case. It really does make a lot of sense on the iPad and has obviously been a success. At the same time, I have no idea how to use it, and I went to art school and don’t mind saying that I can draw. I guess I would say it succeeds in spite of itself.

  14. Funnily enough, the best app designers when it comes to this seem to be the game makers. Angry Birds and Cut the Rope offer very subtle and simple explanations, where one concept is introduced at a time.

    That’s great for the stage-by-stage approach of games, of course, but it’s not always practical. Especially when you want to jump right into an app straight away.

    But maybe some of these app designers should try taking a page out of the game makers’ book!

  15. Hi Khoi,

    I completely agree with your notion that interfaces shouldn’t be explained, and your examples are definitely horrible – if it is visible, it should be obvious what it does.

    But on touchscreens in particular, a lot of the core functionality is not visible, it is hidden in a gesture and gestures are not self-explanatory.

    When the iPhone launched, almost all of the early adopters had already seen Steve use it, so they knew how to swipe and they knew how to pinch.

    And in turn, the early adopters taught the not-so-early adopters what to do.

    The reason that almost everybody can pick up an iPhone and just use it is that they have already been taught how to.

    So, I mostly agree with you, but some things have to be explained, although they can be explained far better than in your examples.

  16. Foodily, a recipe search site, has a nice coaching pop-up that introduces each feature one at a time. The text is quite tight, and it gives you an indication of how many tips there are and a control to advance to the next one.

  17. A very interesting post. It reminded me of the interactive instructions for Coda’s Transmit 4. I wonder what Khoi thinks of them, and of the remarkable UI of the program itself?

    Regards

  18. In regards to Google’s feature call-out that Andrew shared, I find it both interesting and unusable that Google would allow multiple call-outs to be layered in the following manner: link

  19. Just seconding @MK and @Mads Buch Stage’s comments.

    By the way, we shouldn’t forget the impact of advertising as a teaching opportunity. I think if we compiled all the iPhone and iPad TV commercials, you’d have one nice little video tutorial.

    Take the swipe gesture for browsing photos. Did I figure it out on my own (seeing the selected photo use a slide-in transition)? Or did I see it in a commercial? Or did I see a friend using an iPhone? It feels completely intuitive now… but on its own, how obvious was it really?

  20. While I agree with your comments as they relate to “Introducing users to a new app or set of functionality”, I think there are different considerations when dealing with the iPad as it’s a brand new device with a different set of navigation tools from anything that came before. While most users would be familiar with touch screens, Apple introduced a new set of gestures with the iPad that most users wouldn’t necessarily know. As a developer, we gave a high priority to making the app easy to use, but we also felt it was absolutely necessary to include a brief “how-to” that users could access. The real test shouldn’t be whether you have an instructional screen, but how easy your app is to navigate.

  21. Gerry, this is very true. But also consider mouse gestures on desktop computing during the past decade. Mouse gestures have completely failed to pervade mainstream computing. I wonder if this is due to a lack of usefulness in the mouse gesture features, or a lack of training and awareness amongst users.

    If user training is necessary for touchscreen devices, I wonder how much, and in what form, user training will be necessary for atypical touch devices. For example, consider the new Sony PSP NGP, which has touch-sensitive control surfaces on the REAR of the device. What kind of visuals will need to accompany user training for this type of device? What about devices that have non-flat surfaces, such as spherical devices?

  22. Interesting thoughts.

    I do, however, believe instructional screens are useful in every interface. You’ll always have some users who need help.

    Still, some interesting questions are:
    How would you change those interfaces so that they don’t require any instructional screens? What are some examples of interfaces so well designed that they don’t require them?