
Beyond Touch, Swipe and Tap for Smart Clothing!

‘Smart clothing’ offers unique creative potential for interaction with the digital world thanks to its easy-to-access, always-available nature. Most existing work emulates the limited 2D surface gestures of touchscreens (touch, tap and swipe) and thus does not fully exploit the flexible, deformable and tangible material properties of textiles. This talk explores new opportunities for expressive interaction through a set of advanced 2.5D deformation gestures on textile (Twirl, Twist, Fold, Push, Bend, Grasp, Stretch and Shake), and demonstrates a working prototype, ‘SmartSleeve’.

Unlike conventional methods, this paradigm of shape deformation enriches the digital interaction experience by ‘retro-fitting’ it with metaphors from the real world, making it possible to go beyond current 2D gestures. Another creative aspect of 2.5D gestures is that they evoke nostalgic memories of vintage products, which improves usability, especially among elderly users who struggle to interact with modern interfaces. For instance, ‘twisting’ the textile affords rotational control, like a physical knob for changing the volume in virtual services (e.g. Spotify), while ‘stretching’ offers an elastic-input analogy for controlling playback speed. At present, such distinct operations are typically controlled by homogeneous actions (often sliding); this methodology instead provides a wide range of ergonomic interactions congruent with end users’ mental models.

The working prototype, ‘SmartSleeve’, is a tactile pressure-sensing sleeve, designed to be worn directly on the skin, that can sense both 2.5D and 2D gestures. A real-time recognition algorithm classifies 22 gestures, and the classifier has shown an accuracy of 89.5%. In summary, this work is unique in combining pressure with deformation to provide a versatile input modality.
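To give a flavour of what real-time recognition over a pressure-sensing sleeve might involve, here is a minimal sketch: sliding windows of pressure frames are reduced to a few summary features and matched against per-gesture centroids. All function names, the feature set, and the toy gesture labels are illustrative assumptions, not the actual SmartSleeve implementation or its 22-gesture classifier.

```python
# Hypothetical sketch of real-time gesture classification from a
# pressure-sensing textile sleeve. Not the authors' algorithm; a
# nearest-centroid toy used only to illustrate the pipeline shape.
from statistics import mean


def frame_features(frame):
    """Summarise one pressure frame (a flat list of sensor readings)."""
    return (mean(frame), max(frame) - min(frame))


def window_features(frames):
    """Aggregate per-frame features over a sliding window of frames."""
    feats = [frame_features(f) for f in frames]
    means = [m for m, _ in feats]
    spans = [s for _, s in feats]
    # Overall pressure level, its change across the window, and the
    # average within-frame spread.
    return (mean(means), means[-1] - means[0], mean(spans))


def classify(frames, centroids):
    """Assign the gesture label whose centroid is nearest in feature space."""
    x = window_features(frames)

    def dist(label):
        return sum((a - b) ** 2 for a, b in zip(x, centroids[label]))

    return min(centroids, key=dist)
```

For example, with toy centroids such as `{"push": (0.5, 0.8, 0.0), "idle": (0.1, 0.0, 0.0)}`, a window whose mean pressure ramps upward would be labelled `"push"`. A real system would use richer spatio-temporal features and a trained classifier rather than hand-picked centroids.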
