Solar is a new weather app for the iPhone that a lot of people are — not incorrectly! — peeing their pants over. Not so long ago, people had the same reaction to Clear, a to-do list. The thing they have in common? Well, it’s more like the things they don’t have in common — any kind of interface chrome or obvious buttons. You’re presented with information and nothing else on the screen. They’re almost entirely gesture-driven: swipe here, pinch there, flick that.
Apps designed this way are almost reflexively described as “beautiful” and “intuitive.” Beautiful, yes. But reflexively calling them intuitive because they’re almost entirely gesture-driven takes for granted that natural input (i.e., multitouch and gestures) is inherently intuitive — that users will look at these apps or use them for a few seconds and know, instantly, how to use them. This isn’t necessarily true.
Even though Apple has developed a language of gestures and interaction paradigms that gives most users a solid foundation to draw on when opening an iOS app for the first time, simply looking at Solar or Clear will tell you nothing about how to use them — in fact, part of their appeal, at least to advanced users, is that they break from some of those paradigms. And while you can figure out these apps and a great deal of their invisible language fairly quickly, the full range of gestures and what they do is not immediately apparent to many users.
Of course, any interface in any app has some kind of learning curve, but as this kind of app and interaction design becomes more common, particularly in wonderful apps like Clear and Solar, it’s important to resist the temptation to frame anything that moves us away from the last 20 or so years of computer interfaces as necessarily “natural” or “intuitive.” It’s exciting, it really is. But it’s not always natural.