MA in User Experience Design project work

Gesture library

A concept for standardising and categorising gesture-based interactions.

A small-scale practice-based design research project that combined academic study with practical application.

There appeared to be very little consensus on how gesture-based interactions were being described, documented or used.

I conceived the idea of the Gesture library, a standardisation and categorisation resource for interaction designers.

I created a prototype of the Gesture library as a proposed solution to problems that I had identified through research.

A latte and an idea

This project was inspired by a visit to a local café. When my drink arrived, it was presented in a jug that I found tricky to use. This was because, as a left-handed person, the positioning of the handle and spout was wrong for me.

It made me wonder whether the needs of left-handed people were being considered in the design and development of gesture-based interactions.

This became the starting point and initial focus for my research.

Left out

Despite the large number of studies published, covering a wide range of gestures (Villarreal-Narvaez et al. 2020), I couldn’t find any that focused on left-handedness or considered hand dominance as a factor that might influence a user’s experience.

Of the 10 peer-reviewed research papers I reviewed, only one specifically stated that it had used both left- and right-handed participants. Two studies involved left-handed participants but asked them to use their right hand throughout.

Gesture elicitation is a term that covers different types of study that all aim to understand users’ preferences and which gestures feel the most intuitive for various actions. The findings are then used to inform how gesture-based interactions are designed.
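As a rough illustration of the kind of data these studies produce, the proposals collected for a single action can be grouped and tallied to show which gesture participants reach for most often. The sketch below is my own and purely illustrative; the action, participants and gestures are hypothetical rather than taken from any of the studies I reviewed.

```typescript
// Tally gesture proposals from an elicitation session for one action
// ("referent") and report the proportion of participants who suggested each.
type Proposal = { participant: string; gesture: string };

function tallyProposals(proposals: Proposal[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const p of proposals) {
    counts.set(p.gesture, (counts.get(p.gesture) ?? 0) + 1);
  }
  const proportions = new Map<string, number>();
  for (const [gesture, count] of counts) {
    proportions.set(gesture, count / proposals.length);
  }
  return proportions;
}

// Hypothetical proposals for the action "dismiss a notification".
const dismiss: Proposal[] = [
  { participant: 'P1', gesture: 'swipe left' },
  { participant: 'P2', gesture: 'swipe left' },
  { participant: 'P3', gesture: 'swipe right' },
  { participant: 'P4', gesture: 'flick away' },
];

console.log(tallyProposals(dismiss));
// Map { 'swipe left' => 0.5, 'swipe right' => 0.25, 'flick away' => 0.25 }
```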

The most common types of gestures being studied involved arms, hands and fingers. This didn’t surprise me; it’s what first came to mind when I started researching this area.

Mentions of hand dominance in studies

  • ARORA, Rahul et al. 2019. ‘MagicalHands: Mid-Air Hand Gestures for Animating in VR’. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology. UIST ’19: The 32nd Annual ACM Symposium on User Interface Software and Technology, New Orleans LA USA, 17 October 2019, 463–77. Available at: https://dl.acm.org/doi/10.1145/3332165.3347942 [accessed 9 Oct 2022].

  • FRANÇOISE, Jules and Frédéric BEVILACQUA. 2018. ‘Motion-Sound Mapping through Interaction: An Approach to User-Centered Design of Auditory Feedback Using Machine Learning’. ACM Transactions on Interactive Intelligent Systems 8(2), [online], 1–30. Available at: https://dl.acm.org/doi/10.1145/3211826 [accessed 9 Oct 2022].

  • HESENIUS, Marc and Volker GRUHN. 2019. ‘GestureCards: A Hybrid Gesture Notation’. Proceedings of the ACM on Human-Computer Interaction 3(EICS), [online], 1–35. Available at: https://dl.acm.org/doi/10.1145/3331164 [accessed 9 Oct 2022].

  • KATSURAGAWA, Keiko et al. 2019. ‘Bi-Level Thresholding: Analyzing the Effect of Repeated Errors in Gesture Input’. ACM Transactions on Interactive Intelligent Systems 9(2–3), [online], 1–30. Available at: https://dl.acm.org/doi/10.1145/3181672 [accessed 9 Oct 2022].

  • LAM, Kevin C., Carl GUTWIN, Madison KLARKOWSKI and Andy COCKBURN. 2022. ‘More Errors vs. Longer Commands: The Effects of Repetition and Reduced Expressiveness on Input Interpretation Error, Learning, and Effort’. In CHI Conference on Human Factors in Computing Systems. CHI ’22: CHI Conference on Human Factors in Computing Systems, New Orleans LA USA, 29 April 2022, 1–17. Available at: https://dl.acm.org/doi/10.1145/3491102.3502079 [accessed 9 Oct 2022].

  • SHARMA, Adwait et al. 2021. ‘SoloFinger: Robust Microgestures While Grasping Everyday Objects’. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. CHI ’21: CHI Conference on Human Factors in Computing Systems, Yokohama Japan, 6 May 2021, 1–15. Available at: https://dl.acm.org/doi/10.1145/3411764.3445197 [accessed 9 Oct 2022].

  • SHARMA, Adwait, Joan Sol ROO and Jürgen STEIMLE. 2019. ‘Grasping Microgestures: Eliciting Single-Hand Microgestures for Handheld Objects’. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. CHI ’19: CHI Conference on Human Factors in Computing Systems, Glasgow Scotland UK, 2 May 2019, 1–13. Available at: https://dl.acm.org/doi/10.1145/3290605.3300632 [accessed 9 Oct 2022].

  • WEI, Haowen et al. 2022. ‘IndexPen: Two-Finger Text Input with Millimeter-Wave Radar’. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 6(2), [online], 1–39. Available at: https://dl.acm.org/doi/10.1145/3534601 [accessed 9 Oct 2022].

  • YU, Nan, Wei WANG, Alex X. LIU and Lingtao KONG. 2018. ‘QGesture: Quantifying Gesture Distance and Direction with WiFi Signals’. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 2(1), [online], 1–23. Available at: https://dl.acm.org/doi/10.1145/3191783 [accessed 9 Oct 2022].

  • ZHANG, Qian et al. 2021. ‘Write, Attend and Spell: Streaming End-to-End Free-Style Handwriting Recognition Using Smartwatches’. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 5(3), [online], 1–25. Available at: https://dl.acm.org/doi/10.1145/3478100 [accessed 9 Oct 2022].

The impact of unmet needs

It is estimated that left-handed people make up 10-15% of the global population (Porac 2016). This is a significant number of people whose needs are potentially not being met by gesture-based interactions.

A lot of the world we live in has been designed with right-handed people in mind. Scissors are often mentioned as an example, but it goes beyond that. Can openers, vending machines, power tools, and buttons on clothing are primarily designed to be used by right-handed people (Kohlstedt 2020).

My own primary research, conducted using a self-completion questionnaire, looked into the challenges faced by left-handed people and how this impacts their daily lives. The questionnaire gathered 42 responses from left-handed participants.

Noticing problems

85.7% of respondents said they do notice problems when using physical or digital objects that have been designed for right-handed use as the default.

How often are problems noticeable?

None of the respondents said that they notice these challenges all of the time, but the majority said that they notice them some (53.7%) or a lot (24.4%) of the time.

The impact

However, despite experiencing these challenges frequently, most respondents consider the impact to be quite small. Only one person felt it affected them a lot.

My research indicated that the impact is small but it is frequent and noticeable.

Designing and developing gesture-based interactions is complex and requires multiple disciplines to work together throughout the entire process (Hesenius and Gruhn 2019). This complexity could be why research and development in this area has focused on right-handed users: it controls a variable and maintains some simplicity.

However, not designing gesture-based interactions for left-handed users could add to that small but noticeable impact of a lower quality experience.

Observations

I invited a selection of left-handed users to participate in observation sessions, conducted remotely via Microsoft Teams. During these sessions, I described six common gesture-based interactions and asked participants to perform each one.

I then compared my own observations with diagrams of the same gestures performed by right-handed people, included in the literature I had reviewed.

The following gestures were performed in the same way by all of my participants, but in a different way to the diagrams showing right-handed users:

  • Draw a V-shape in the air

  • Palm facing down to palm facing up

The following gestures were performed in the same way by all participants and matched the diagrams:

  • Fist shape to outstretched hand

  • Pinch thumb and finger

The following gestures were performed differently by each participant:

  • Draw an X-shape in the air

  • Swiping to the side

So while there do appear to be differences in how left- and right-handed people perform gesture-based interactions, some gestures also vary between users who share the same dominant hand.

Variations in performance are likely to impact how those gestures are recognised by sensing technology and then interpreted by digital systems.
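To illustrate why (this is my own hedged sketch, not code from the project or any particular recogniser), a simple point-based matcher that compares a performed path against a stored template point by point will score a mirrored performance, such as the V-shape traced from the opposite side, as a poor match unless it also checks the mirrored variant. All names and values below are assumptions for illustration.

```typescript
// Compare two gesture paths of equal length by average point-to-point distance.
type Point = { x: number; y: number };

function pathDistance(a: Point[], b: Point[]): number {
  const n = Math.min(a.length, b.length);
  let total = 0;
  for (let i = 0; i < n; i++) {
    total += Math.hypot(a[i].x - b[i].x, a[i].y - b[i].y);
  }
  return total / n;
}

// Mirror a path around the vertical axis. For a symmetrical shape like the V,
// this produces the same visual shape traced in the opposite direction, which
// a point-order-sensitive comparison treats as a different gesture.
function mirrorX(path: Point[]): Point[] {
  return path.map(p => ({ x: -p.x, y: p.y }));
}

// Template for a V drawn left to right: a down-stroke followed by an up-stroke.
const vTemplate: Point[] = [
  { x: -1, y: 1 },
  { x: -0.5, y: 0 },
  { x: 0, y: -1 },
  { x: 0.5, y: 0 },
  { x: 1, y: 1 },
];

// A performance that mirrors the template, as a left-handed user might produce.
const performed = mirrorX(vTemplate);

console.log(pathDistance(performed, vTemplate));          // 1.2 – a poor match
console.log(pathDistance(performed, mirrorX(vTemplate))); // 0 – an exact match
```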

Describing and documenting

A lack of clear and standardised documentation or shared language to describe gesture-based interactions was an issue that appeared to be repeated across many of the research papers I reviewed.

I realised addressing this need would make a useful contribution to the interaction design community. It could also provide a way to indicate which gestures are affected by differences in hand dominance.

Inspired by Google’s Material Design resources, design systems and style guide documentation, I proposed creating the Gesture library.

I created this mood board while researching different approaches to style guides. You can explore it yourself here.

A first iteration

The wireframes featured in this video can be explored here.

Due to time constraints and my own lack of Figma experience at the time, this was as far as I was able to take the concept. There are quite a few aspects I would like to improve.

I’d like to fill in some of the gaps in content, making the prototype higher fidelity and communicating the concept more clearly.

I’d also like to spend some time refining the visual design aspects of this prototype. The current visuals are inspired by Labanotation, a system of recording movement that originates from dance and choreography (Guest 2011). However, I think this could be developed further to make that connection clearer and better support the Gesture library’s written content.

Project limitations

As with many research projects, it’s important to be aware of the limitations of this study, and how they may have impacted my work.

  • As this research was carried out for a 12-week, part-time MA module, the most obvious limitations for this study are time and budget. These factors strongly influenced the research area as well as the tools and methods used throughout.

    Ideally, a pilot study would have been carried out to test the validity of the questionnaire and observations. However, this wasn’t possible within the time available.

  • As a Masters-level student, I was a relatively inexperienced researcher.

    Part-way through the project, additional key words were suggested that could have resulted in a more effective literature search. However, at that point in the project there wasn’t enough time to return to that stage of the process. As a result, relevant, existing research could have been missed.

    If this project were to be expanded upon in future, this is something that could be improved by adding the key words “HCI” and “Human Computer Interaction” to the searches and exploring the results.

  • Despite efforts to remain objective, there may be some bias in the way this research has been conducted, analysed and reported because I am left-handed.

  • Participants were selected using my own social media channels, which might have limited diversity and excluded anyone who does not use social media. This, and the self-selection nature of the questionnaire, might also have attracted people who were particularly keen to discuss the challenges they face as a result of being left-handed.

    The questionnaire did achieve a good response rate; however, there should ideally have been more observation participants. Three participants were enough to indicate user behaviour, but a sample size of 20 or more may have led to “more meaningful conclusions” (Loranger 2016).

Future steps

Having proposed the Gesture library as a solution to the problems I had identified, the next step would be to start testing it. This would help me to determine how well it solves those issues and how useful it would be as a resource for interaction designers.

If the concept proves to be viable, the next steps would be to start refining it by improving the visuals and filling in any missing content. As the concept is refined and developed, it would need to be tested again, following the design, test, iterate loop that is characteristic of the agile working approach (Waldock 2015; Norman 2019).

The next step would then be to begin gathering information by carrying out Gesture Elicitation Studies (GES) and then organising that information to populate the library.

Each library entry should include information about:

  • What the gesture is called

  • How the gesture is performed

  • How intuitive (or discoverable) it is

  • Accessibility and limitations of the gesture

This would be an enormous task for one person or organisation. Crowdsourcing could provide a feasible way to achieve this. Training and information templates could be produced to maintain consistency across the library.
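To make the shape of an entry more concrete, here is a rough sketch of how those four kinds of information could be captured in an entry template. The field names and example values are my own assumptions for illustration, not a finalised schema for the Gesture library.

```typescript
// A possible structure for a single Gesture library entry.
interface GestureEntry {
  name: string;                               // what the gesture is called
  performance: string;                        // how the gesture is performed
  handDominanceVariants: boolean;             // whether left- and right-handed performances differ
  discoverability: 'low' | 'medium' | 'high'; // how intuitive (or discoverable) it is
  accessibility: string[];                    // accessibility notes and limitations of the gesture
  sources: string[];                          // elicitation studies the entry draws on
}

// Example entry based on one of the gestures used in the observation sessions;
// the values here are illustrative guesses rather than research findings.
const drawV: GestureEntry = {
  name: 'Draw a V-shape in the air',
  performance: 'Trace a V with the index finger: one down-stroke, then one up-stroke.',
  handDominanceVariants: true, // performed differently by the left-handed participants observed
  discoverability: 'medium',
  accessibility: ['Requires mid-air arm and hand movement'],
  sources: [],
};

console.log(`${drawV.name}: discoverability ${drawV.discoverability}`);
```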

  • GUEST, Ann Hutchinson. 2011. Labanotation: The System of Analyzing and Recording Movement. New York: Routledge.

  • HESENIUS, Marc and Volker GRUHN. 2019. ‘GestureCards: A Hybrid Gesture Notation’. Proceedings of the ACM on Human-Computer Interaction 3(EICS), [online], 1–35. Available at: https://dl.acm.org/doi/10.1145/3331164 [accessed 9 Oct 2022].

  • KOHLSTEDT, Kurt. 2020. ‘Left Behind: Persistent Frustrations in a World Designed for Right-Handers’. 99% Invisible [online]. Available at: https://99percentinvisible.org/article/left-behind-persistent-frustrations-in-a-world-designed-for-right-handers/ [accessed 10 Jan 2024].

  • LORANGER, Hoa. 2016. ‘Checklist for Planning Usability Studies’. Nielsen Norman Group [online]. Available at: https://www.nngroup.com/articles/usability-test-checklist/ [accessed 10 Jan 2024].

  • NORMAN, Don. 2019. Observe, Test, Iterate, and Learn [Film]. Available at: https://www.youtube.com/watch?v=JgPppwsocRU [accessed 11 Jan 2024].

  • PORAC, Clare. 2016. Laterality: Exploring the Enigma of Left-Handedness. Amsterdam; Boston: Elsevier/Academic Press.

  • VILLARREAL-NARVAEZ, Santiago, Jean VANDERDONCKT, Radu-Daniel VATAVU and Jacob O. WOBBROCK. 2020. ‘A Systematic Review of Gesture Elicitation Studies: What Can We Learn from 216 Studies?’ In Proceedings of the 2020 ACM Designing Interactive Systems Conference. DIS ’20: Designing Interactive Systems Conference 2020, Eindhoven Netherlands, 3 July 2020, 855–72. Available at: https://dl.acm.org/doi/10.1145/3357236.3395511 [accessed 9 Oct 2022].

  • WALDOCK, Belinda. 2015. Being Agile in Business. Pearson Education Limited.

Find out more

If you found this case study interesting and would like to learn more about the projects I completed for my MA in UX Design, you can explore the blog posts below.

Previous: A wedding journey

Next: Roundabout website redesign