Sign language enables effective communication between hearing and deaf people. Despite years of extensive pedagogical research, learning sign language remains a formidable task: most current systems rely heavily on online learning resources and presume that users will access them regularly, yet this approach can feel monotonous and repetitive. Recently, gamification has been proposed as a solution; however, existing research focuses on game design rather than user experience design. In this work, we present a system for user-defined interaction for learning static American Sign Language (ASL). It supports gesture recognition for user experience design and enables users to learn actively through involvement with user-defined gestures, rather than just passively absorbing knowledge. Early findings from a questionnaire-based survey show that users are more motivated to learn static ASL through user-defined interactions.
Wang, J., Ivrissimtzis, I., Li, Z., Zhou, Y., & Shi, L. (2023). User-Defined Hand Gesture Interface to Improve User Experience of Learning American Sign Language. In C. Frasson, P. Mylonas, & C. Troussas (Eds.), Augmented Intelligence and Intelligent Tutoring Systems: 19th International Conference, ITS 2023, Corfu, Greece, June 2–5, 2023, Proceedings (pp. 479–490). Springer Verlag. https://doi.org/10.1007/978-3-031-32883-1_43