Chronicles of a Windows user gone iPhone (part 4)


I’ve been using the iPhone 6s with Apple’s new iOS 11 for a while now. I’ve criticized the intrusive setup process, the poor and inconsistent one-handed usability, and a few app annoyances. In part 4 I’ve discovered even more awkward interface issues, but maybe I’ll come to terms with them after all. If you missed part 1, part 2, or part 3, you can check those out as well.

More secret gestures

In part 2 I learned about the secret left-edge screen gesture that sometimes acts as a back button, sometimes opens hidden menus, sometimes does other things, and sometimes does nothing. Swiping horizontally is a pretty good navigation method for handheld smartphones. It reminds me of the old panoramic hubs in Windows Phone 7, except those were much better in a few key ways. In Windows Phone 7, when you activated a hub, its full-length 2D panorama graphic flew into view like a sheet of paper and then settled on one specific section. This animation was a brilliant way of indicating to the user that there was more content to the left and right. If that wasn’t enough, the hub designs landed in a way that cropped a bit of text on the edges. Again, this was a very smart way of showing the user that there’s more stuff beyond the edges. Anyone who can read the language recognizes those letters and instantly sees that part of the word is missing at the edge, and tapping and dragging the rest of the word into view to reveal the hidden content is totally intuitive. There are no such visual indicators at all teaching users about the left-edge swipe gesture in iOS. It’s completely non-discoverable, and I never would have found it if someone hadn’t told me about it.

The fly-in hub designs were great for showing side scrolling capabilities.

But wait, there are more hidden gesture controls! I already knew about the top-edge screen gesture, since Android and Windows Phone 8 both implemented that long ago. It’s still a bad design decision, but at least I knew about it. On iOS, there’s also a bottom-edge swipe gesture. Swiping up from the bottom edge shows a pile of ambiguous icons that turn out to be toggle switches. The bottom row can be customized in the settings to launch a handful of different things without having to go back to the home screen, and that’s a really great option. The mess of unlabeled icons can be really hard to understand, though. I customized my bottom row to show some options that I thought would be useful, but I’ve already forgotten what the icons actually do, so I’ll probably have to go back into the settings to read the labels there. Putting this kind of thing behind the bottom-edge screen gesture is great because the bottom edge of the screen is very easy to reach while holding the phone. It’s not great that the discoverability is really low, and it’s not great that the icons are unclear. There are also issues with the radio toggles (Wi-Fi and Bluetooth) not actually functioning as full power switches for those radios.
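For the developer-minded, here’s a rough illustration of why these edge swipes are so invisible. The snippet below is only a minimal sketch of how a third-party app can register its own edge-swipe gesture with UIKit’s UIScreenEdgePanGestureRecognizer (it is not how Apple implements the system back swipe or Control Center): the recognizer fires when a pan starts at the chosen screen edge, and nothing in the API draws any on-screen hint that the gesture exists.

    import UIKit

    // Minimal sketch: a view controller that treats a completed left-edge swipe
    // as a "back" action. UIKit provides no visual affordance for this gesture;
    // discoverability is entirely up to the app.
    class EdgeSwipeViewController: UIViewController {

        override func viewDidLoad() {
            super.viewDidLoad()

            // Fires only when the pan begins at the left edge of the screen.
            let leftEdgeSwipe = UIScreenEdgePanGestureRecognizer(
                target: self,
                action: #selector(handleLeftEdgeSwipe(_:))
            )
            leftEdgeSwipe.edges = .left
            view.addGestureRecognizer(leftEdgeSwipe)
        }

        @objc private func handleLeftEdgeSwipe(_ gesture: UIScreenEdgePanGestureRecognizer) {
            // Treat a finished edge swipe as "go back," as many apps do.
            if gesture.state == .ended {
                navigationController?.popViewController(animated: true)
            }
        }
    }

The whole affordance lives in code; unless the app draws its own hint, the user has no way of knowing the edge is interactive.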

Now here’s an even crazier hidden gesture control. If I swipe down on the home screen, the system search appears along with the keyboard right away! This was a totally accidental find; I was probably trying to access the notifications drop-down with a top-edge screen swipe. If you miss the top edge, you get the search instead. Yep, that’s two top-down swipe gestures that do completely different things. Simple and easy? Not really! These complicated gestures certainly add efficiency for power users navigating the operating system, but simple and intuitive they are not.

Arranging icons

As I found in part 3, having different icons for different email accounts is an improvement, but trying to arrange them is awful. Back in iOS 4, I could tap and hold an icon on the home screen, and then all of them would start wiggling, which meant I could rearrange them… or uninstall them by tapping the little X in the corner of each icon. Nowadays, tap & hold brings up a context-sensitive menu related to the app I tapped and held my finger on. But wait!! Sometimes tap & hold DOES go to icon rearrangement mode.

It turns out the tap & hold action has two different functions depending on the radius of your finger’s contact area, or something like that. How is that intuitive or simple? Sorry, it isn’t. Apple calls this awkward mash-up of a single control method for two functions “Force Touch,” er, “3D Touch,” because generally you have to press harder to get a larger touch radius. In practice it’s very difficult to control whether you’re activating 3D Touch or activating tap & hold. Usually when I want one function to activate, the other one does. I’m sure this takes some “getting used to” and eventually I’ll adapt to this convoluted user-interaction method. Actually, it looks like if I adapt to the phone and touch really, really lightly, I can enable the icon-arrangement mode… while if I press really, really hard, I can activate the context-sensitive menu. Seriously unintuitive!
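For what it’s worth, the pressure part isn’t magic: on 3D Touch hardware, iOS reports a per-touch force value to apps. The snippet below is a hedged sketch (an illustration, not Apple’s home-screen code) of how an app could split “light” from “hard” presses using UITouch.force; the 0.5 cutoff is an arbitrary value I picked, and that kind of invisible threshold is exactly why the two actions are so easy to confuse.

    import UIKit

    // Illustration only: distinguish a light touch from a hard press using the
    // normalized force that 3D Touch hardware reports for each touch.
    class PressureAwareView: UIView {

        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            super.touchesMoved(touches, with: event)

            // Bail out on devices without pressure-sensitive screens.
            guard traitCollection.forceTouchCapability == .available,
                  let touch = touches.first else { return }

            // 0.0 ... 1.0, where 1.0 is the hardest press the hardware registers.
            let normalizedForce = touch.force / touch.maximumPossibleForce

            if normalizedForce > 0.5 {   // arbitrary threshold for this sketch
                print("Hard press: show the context menu")
            } else {
                print("Light press: wait for the long-press timer (rearrange mode)")
            }
        }
    }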

Maybe all of these hidden controls are good

For beginning users, hidden controls like all of these screen gestures, variable-pressure controls, double taps, and double presses are clearly a bad thing. They increase complexity and are impossible to learn quickly without being taught (or stumbling onto them by accident). From observing and talking to my friends who have always used iPhones, I’ve found that many of them don’t even know about some of these gestures. Many of the gestures don’t actually work on their iPhones anyway, because they have cases that cover the edges of the bezel.

On the other hand, these hidden commands actually make using iOS much more efficient for people who know about them. If I can remember which secret gesture does what, or build some muscle memory for them, getting things done becomes much faster than simply poking at buttons on the screen.
