UX Design Bits - Audiosume - Learn UI - From Buttons to Gestures

Gestures are not confined to a small area of screen real estate and, compared with buttons, demand far less attention. The difference is noticeable when using a product on the go. Audiosume uses gestures for critical actions, which raises the question of how to communicate the gesture affordance and make it learnable.

The obvious options, such as introducing the gestures during onboarding, were discarded. A person is highly unlikely to learn something without a perceived benefit, and onboarding already throws a lot of new information at the user.

The challenge has been to find the right moment to introduce a gesture. One promising alternative was to introduce the gesture with voice: when the person selected the button to edit news sources, for example, the app would mention the ability to swipe down instead. It didn't work. Once people got to the news sources, they were focused on a specific task, and introducing a gesture at that moment was a distraction.

[Image: Audiosume usability testing]

In the end, the best results came from introducing the gesture after the person completes the action. In the example of the news sources button, the app introduces the swipe-down gesture once the person completes the task and returns to the main screen.

If the person uses the gesture the next time, the corresponding button disappears. If not, the app stops mentioning the gesture. The UI adapts to the user's behavior.
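
A minimal sketch of how this adaptive behavior could be modeled as a small state machine per gesture-backed action. The names here (GestureHintState, onTaskCompletedWithButton, speak) are hypothetical and not taken from Audiosume's actual code; they only illustrate the hint-then-adapt flow described above.

```typescript
// Possible states for one gesture-backed action (hypothetical model).
type GestureHintState =
  | "unintroduced" // the person has only used the button so far
  | "introduced"   // the hint was voiced after the last button-driven completion
  | "adopted"      // the person performed the gesture; the button can disappear
  | "dismissed";   // the person ignored the hint; stop mentioning the gesture

interface GestureHint {
  state: GestureHintState;
}

// Called when the person finishes the task via the button and returns
// to the main screen: this is the moment the gesture is introduced.
function onTaskCompletedWithButton(hint: GestureHint): GestureHint {
  switch (hint.state) {
    case "unintroduced":
      speak("Next time, you can swipe down instead."); // voice hint
      return { state: "introduced" };
    case "introduced":
      // The hint was already given and the person still chose the button:
      // don't talk about the gesture anymore.
      return { state: "dismissed" };
    default:
      return hint;
  }
}

// Called when the person performs the gesture: the button is no longer needed.
function onGestureUsed(_hint: GestureHint): GestureHint {
  return { state: "adopted" };
}

// The UI reads the state to decide whether to keep rendering the button.
function shouldShowButton(hint: GestureHint): boolean {
  return hint.state !== "adopted";
}

// Placeholder for the app's text-to-speech output.
function speak(message: string): void {
  console.log(`[voice] ${message}`);
}
```

In this sketch the button stays visible until the gesture is actually adopted, so the hint never removes a working control, and a single ignored hint is enough to silence the voice prompt.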