
Attention Modeling with Temporal Shift in Sign Language Recognition



Abstract:

Sign languages are visual languages expressed through multiple cues, including facial expressions, upper-body movements, and hand gestures. These visual cues can be used together or at different instants to convey the message. To recognize sign languages, it is therefore crucial to model what, where, and when to attend. In this study, we developed a model that uses different visual cues simultaneously by combining Temporal Shift Modules (TSMs) with attention modeling. Our experiments are conducted on the BosphorusSign22k dataset. Our system achieves 92.46% recognition accuracy, improving on the baseline study's 78.85% accuracy by approximately 14 percentage points.
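To make the two ingredients named in the abstract concrete, the sketch below shows a generic temporal shift operation (in the style of the original TSM work) and a simple attention pooling over per-cue feature vectors. This is only an illustrative sketch under assumed shapes; the names `temporal_shift`, `shift_div`, and `CueAttention` are hypothetical and not taken from the paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def temporal_shift(x, shift_div=8):
    """Shift a fraction of channels along the time axis.

    x: tensor of shape (batch, time, channels, height, width).
    One 1/shift_div slice of channels is shifted one step back in time,
    another slice one step forward, and the rest is left untouched, so
    each frame's features mix with its neighbours at no extra parameter cost.
    """
    b, t, c, h, w = x.shape
    fold = c // shift_div
    out = torch.zeros_like(x)
    out[:, :-1, :fold] = x[:, 1:, :fold]                   # pull features from the next frame
    out[:, 1:, fold:2 * fold] = x[:, :-1, fold:2 * fold]   # pull features from the previous frame
    out[:, :, 2 * fold:] = x[:, :, 2 * fold:]              # remaining channels stay in place
    return out

class CueAttention(nn.Module):
    """Toy attention pooling over per-cue features (e.g. face, hands, upper body)."""

    def __init__(self, feat_dim):
        super().__init__()
        self.score = nn.Linear(feat_dim, 1)

    def forward(self, cues):
        # cues: (batch, num_cues, feat_dim)
        weights = F.softmax(self.score(cues), dim=1)   # (batch, num_cues, 1)
        return (weights * cues).sum(dim=1)             # attention-weighted sum over cues

if __name__ == "__main__":
    clip = torch.randn(2, 16, 64, 28, 28)   # 2 clips, 16 frames, 64 channels
    print(temporal_shift(clip).shape)        # torch.Size([2, 16, 64, 28, 28])

    cues = torch.randn(2, 3, 256)            # 3 visual cue streams per sample
    print(CueAttention(256)(cues).shape)     # torch.Size([2, 256])
```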
Date of Conference: 15-18 May 2022
Date Added to IEEE Xplore: 29 August 2022
Print on Demand (PoD) ISSN: 2165-0608
Publisher: IEEE
Conference Location: Safranbolu, Turkey