Touch and Gesture-Based Interfaces in Creative Workflows: A Review

Authors

  • Siti Rohaida Alimin, Faculty of Creative Multimedia and Computing, Universiti Islam Selangor, Malaysia
  • Nor Musliza Mustafa, Faculty of Creative Multimedia and Computing, Universiti Islam Selangor, Malaysia
  • Nur Aisya Insyira Manaf, Faculty of Creative Multimedia and Computing, Universiti Islam Selangor, Malaysia
  • Helyawati Baharudin, Faculty of Creative Multimedia and Computing, Universiti Islam Selangor, Malaysia
  • Farhana Abdullah Asuhaimi, Faculty of Creative Multimedia and Computing, Universiti Islam Selangor, Malaysia

DOI:

https://doi.org/10.53840/myjict10-2-228

Keywords:

touch, gesture, stylus, creative workflows, AR/VR, user studies, interaction design, gesture recognition, multimodal sensing

Abstract

Touch and gesture-based interfaces have become prominent tools in the creative industries, where designers and entertainers engage with digital media. These interfaces nevertheless present shortcomings, such as ergonomic fatigue and limited precision when operating touchscreens. This paper comprehensively examines the effectiveness and ergonomic considerations of these interfaces in facilitating precision, expressiveness and intuitive interaction. A structured literature review was conducted of peer-reviewed articles published from 2020 to 2025, with an emphasis on empirical studies in creative human-computer interaction (HCI). The review shows that hybrid approaches combining touch, stylus and gesture input are increasingly used in creative software and hardware, offering more adaptable and inclusive designs. The findings support the development of future user-centred interfaces that balance natural interaction with functional control, ensuring accessibility, reducing fatigue and increasing creative output. The study is limited to specific interface types and short-term usage contexts.
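To make the hybrid input pattern above concrete, the following minimal TypeScript sketch shows one way a drawing canvas could route touch, stylus and mouse events through the standard Pointer Events API, which reports the input modality in PointerEvent.pointerType. This is an illustrative assumption rather than an implementation from any reviewed system; the element id sketch-canvas and the callbacks beginStroke, beginGesture and selectAt are hypothetical placeholders.

// Minimal sketch: routing hybrid input on a creative canvas by pointer type.
// The element id and the three callbacks below are hypothetical placeholders.
const canvas = document.querySelector<HTMLCanvasElement>("#sketch-canvas");

canvas?.addEventListener("pointerdown", (event: PointerEvent) => {
  switch (event.pointerType) {
    case "pen":
      // Stylus: use pressure for expressive, precision stroke work.
      beginStroke(event.offsetX, event.offsetY, event.pressure);
      break;
    case "touch":
      // Touch: reserve for coarse gestures such as panning and zooming.
      beginGesture(event.pointerId, event.offsetX, event.offsetY);
      break;
    default:
      // Mouse or other pointers: fall back to precise selection.
      selectAt(event.offsetX, event.offsetY);
  }
});

// Hypothetical application hooks; a real tool would supply its own.
function beginStroke(x: number, y: number, pressure: number): void { /* ... */ }
function beginGesture(id: number, x: number, y: number): void { /* ... */ }
function selectAt(x: number, y: number): void { /* ... */ }

Splitting modalities this way keeps precision work on the stylus and coarse navigation on touch, which is one route to the fatigue reduction and adaptability the review attributes to hybrid designs.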


References

Aleksei. (2024). The Future of UI: Designing for Voice and Gesture. Medium. https://medium.com/@Alekseidesign/the-future-of-ui-designing-for-voice-and-gesture-84f5f7061c65

Arora, R. (2020). Creative expression with immersive 3D interactions. Conference on Human Factors in Computing Systems - Proceedings. https://doi.org/10.1145/3334480.3375028

Association of Human-Computer Interaction. (2023). Exploring the Role of Gestural Interaction in User Interface Design: Challenges and Opportunities. https://www.hci.org.uk/article/exploring-the-role-of-gestural-interaction-in-user-interface-design-challenges-and-opportunities/

Buxton, B. (2010). 31.1: Invited paper: A touching story: A personal perspective on the history of touch interfaces past and future. 48th Annual SID Symposium, Seminar, and Exhibition 2010, Display Week 2010, 1(May), 444–448. https://doi.org/10.1889/1.3500488

Buxton, W. (2020). There’s More to Interaction Than Meets the Eye: Some Issues in Manual Input. In User Centered System Design (pp. 319–338). https://doi.org/10.1201/b15703-15

Capian. (2025). Bastien & Scapin Ergonomic Criteria for the Evaluation of Human-Computer Interfaces. https://capian.co/ergonomic-criteria-bastien-scapin

Wigdor, D., & Wixon, D. (2011). Brave NUI World: Designing Natural User Interfaces for Touch and Gesture. ACM SIGSOFT Software Engineering Notes, 36(6), 29–30. https://doi.org/10.1145/2047414.2047439

Gheran, B. F., Villarreal-Narvaez, S., Vatavu, R. D., & Vanderdonckt, J. (2022). RepliGES and GEStory: Visual Tools for Systematizing and Consolidating Knowledge on User-Defined Gestures. ACM International Conference Proceeding Series. https://doi.org/10.1145/3531073.3531112

Greenberg, S., Carpendale, S., Marquardt, N., & Buxton, B. (2012). Sketching User Experiences. In Sketching User Experiences. https://doi.org/10.1016/C2009-0-61147-8

Hao, Z., Sun, Z., Li, F., Wang, R., & Peng, J. (2024). Millimeter wave gesture recognition using multi-feature fusion models in complex scenes. Scientific Reports, 14(1), 1–21. https://doi.org/10.1038/s41598-024-64576-6

Sutherland, I. E. (2003). Sketchpad: A man-machine graphical communication system. In University of Cambridge Computer Laboratory (Vol. 30, Issue 1). https://doi.org/10.5802/ambp.417

Bansal, J. (2023). Gesture-based Human-Computer Interaction using Wearable Devices. International Journal for Research Publication and Seminar, 14(4), 141–150. https://doi.org/10.36676/jrps.2023-v14i4-020

Liu, X., Zhang, Y., & Tong, X. (2024). Touchscreen-based Hand Tracking for Remote Whiteboard Interaction. UIST 2024 - Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology. https://doi.org/10.1145/3654777.3676412

Rousselle, M. (2025). How Touchless Interaction is Changing UX/UI Design. UX Planet. https://uxplanet.org/how-touchless-interaction-is-changing-ux-ui-design-588300e0d053

Mo, G. B., Dudley, J. J., & Kristensson, P. O. (2021). Gesture Knitter: A Hand Gesture Design Tool for Head-Mounted Mixed Reality Applications. Conference on Human Factors in Computing Systems - Proceedings. https://doi.org/10.1145/3411764.3445766

Myers, B., Hudson, S. E., & Pausch, R. (2000). Past, Present, and Future of User Interface Software Tools. ACM Transactions on Computer-Human Interaction, 7(1), 3–28. https://doi.org/10.1145/344949.344959

Osman Hashi, A., Zaiton Mohd Hashim, S., & Bte Asamah, A. (2024). A Systematic Review of Hand Gesture Recognition: An Update from 2018 to 2024. IEEE Access, 12(1), 143599–143626. https://doi.org/10.1109/ACCESS.2024.3421992

Pan, L., Yu, C., He, Z., & Shi, Y. (2023). A Human-Computer Collaborative Editing Tool for Conceptual Diagrams. Conference on Human Factors in Computing Systems - Proceedings. https://doi.org/10.1145/3544548.3580676

Philippe, H. (2025). How Gesture-Based Interaction Is Transforming UX/UI Design. Raw.Studio. https://raw.studio/blog/how-gesture-based-interaction-is-transforming-ux-ui-design/

Płaza, G., Kabiesz, P., & Jamil, T. (2025). Ergonomics/human factors in the era of smart and sustainable industry: Industry 4.0/5.0. Management Systems in Production Engineering. https://doi.org/10.2478/mspe-2025-0022

Potts, D., Dabravalskis, M., & Houben, S. (2022). TangibleTouch: A Toolkit for Designing Surface-based Gestures for Tangible Interfaces. ACM International Conference Proceeding Series. https://doi.org/10.1145/3490149.3502263

Raswan, M., Kay, T., Camarillo-Abad, H. M., Cibrian, F. L., & Qi, T. Di. (2023). Guess the Gesture: Uncovering an Intuitive Gesture-based User Interface for 3D Content Interaction in Virtual Reality. ACM International Conference Proceeding Series, 361–364. https://doi.org/10.1145/3591196.3596610

Tannen, R. (2025). Ergonomics for Interaction Designers: Understanding and applying physical fit in user interface research & design.

Rogers, Y., Sharp, H., & Preece, J. (2011). Interaction Design: Beyond Human-Computer Interaction (3rd ed.). John Wiley & Sons. http://proquestcombo.safaribooksonline.com/book/web-development/usability/9780470665763/chapter-11-design-prototyping-and-construction/navpoint-86?uicode=open

Sai, M. S. S., Sunaina, S., Sravanthi, Y., & Mounika, S. (2024). AI-Enhanced Hand Gestures for Dynamic Presentations. International Journal of Innovative Research in Science, Engineering and Technology, 13(3). https://doi.org/10.15680/IJIRSET.2024.1303158

Scott, S. M., & Raftery, C. (2021). Brain-Computer Interfaces and Creative Expression: Interface Considerations for Rehabilitative and Therapeutic Interactions. Frontiers in Computer Science, 3, 718605. https://doi.org/10.3389/fcomp.2021.718605

Shahi, S., Mollyn, V., Park, C. T., Kang, R., Liberman, A., Levy, O., Gong, J., Bedri, A., & Laput, G. (2024). Vision-Based Hand Gesture Customization from a Single Demonstration. UIST 2024 - Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology. https://doi.org/10.1145/3654777.3676378

Tang, G., Wu, T., & Li, C. (2023). Dynamic Gesture Recognition Based on FMCW Millimeter Wave Radar: Review of Methodologies and Results. Sensors, 23(17), 1–19. https://doi.org/10.3390/s23177478

Ye, Q., Yong, Z. Z., Han, B., Yen, C. C., & Zheng, C. (2024). PaperTouch: Tangible Interfaces through Paper Craft and Touchscreen Devices. Conference on Human Factors in Computing Systems - Proceedings. https://doi.org/10.1145/3613904.3642571

Yoon, Y., Wolfert, P., Kucherenko, T., Viegas, C., Nikolov, T., Tsakov, M., & Henter, G. E. (2022). The GENEA Challenge 2022: A large evaluation of data-driven co-speech gesture generation. In ACM International Conference Proceeding Series (Vol. 1, Issue 1). Association for Computing Machinery. https://doi.org/10.1145/3536221.3558058

Zindulka, T., Sekowski, J. M., Lehmann, F., & Buschek, D. (2025). Exploring Mobile Touch Interaction with Large Language Models. Conference on Human Factors in Computing Systems - Proceedings. https://doi.org/10.1145/3706598.3713554

Zou, Q., Bai, H., Gao, L., Lee, G. A., Fowler, A., & Billinghurst, M. (2024). Stylus and Gesture Asymmetric Interaction for Fast and Precise Sketching in Virtual Reality. International Journal of Human-Computer Interaction, 40(23), 8124–8141. https://doi.org/10.1080/10447318.2023.2278294

Published

21-12-2025

Issue

Section

Articles

How to Cite

Alimin, S. R., Mustafa, N. M., Manaf, N. A. I., Baharudin, H., & Abdullah Asuhaimi, F. (2025). Touch and Gesture-Based Interfaces in Creative Workflows: A Review. Malaysian Journal of Information and Communication Technology (MyJICT), 10(2), 69-82. https://doi.org/10.53840/myjict10-2-228
