Abstract
Research on gestural control interfaces is becoming more widespread in the pursuit of natural interfaces. Freehand and on-skin touch gestures are two popular gesture types because they eliminate the need for an intermediary device. Previous studies investigated these modalities separately with user-elicitation methods; however, a direct comparison of the two is missing from the field. In this study, we compare user-elicited on-skin touch and freehand gesture sets to explore users' preferences. We conducted an experiment in which we compared 13 gestures for controlling computer tasks in each set. Eighteen young adults participated in our study and completed a survey consisting of the NASA Task Load Index and four additional items on social acceptability, learnability, memorability, and goodness. The results show that on-skin touch gestures were less physically demanding and more socially acceptable than freehand gestures. On the other hand, freehand gestures were more intuitive than on-skin touch gestures. Overall, our results suggest that different gesture types could be useful in different scenarios. Our contribution may inspire designers and developers to make better-informed judgments when designing new gestural interfaces for a variety of devices.
Role: Concept Creator, Supervisor
Type: Full Paper
Conference: Distributed, Ambient and Pervasive Interactions
Stats: h5-index: 34
Date: 2017
Co-Authors: Hayati Havlucu, Mehmet Yarkın Ergin, İdil Bostan, Tilbe Göksun and Oğuzhan Özcan
Abstract
Hand-specific on-skin (HSoS) gestures are a trending interaction modality, yet there is a gap in the field regarding users' preferences for these gestures. Thus, we conducted a user-elicitation study in which we collected 957 gestures from 19 participants for 26 commands. Results indicate that users (1) use one hand as a reference object, (2) assign different meanings to different parts of the hand, (3) give more importance to hand properties than to skin properties, and (4) can turn their hands into self-interfaces. Moreover, according to users' subjective evaluations, (5) exclusive gestures are less tiring than intuitive ones. We report users' subjective evaluations of these findings and propose a 33-element taxonomy to categorize the gestures. Furthermore, we present two user-defined gesture sets: an intuitive set comprising users' first choices and natural-feeling gestures, and an exclusive set comprising more creative gestures indigenous to this modality. Our findings can inspire and guide designers and developers of HSoS interfaces.
Role: Concept Creator, Main Co-Author, Design Researcher
Type: Full Paper
Conference: Proceedings of the 2017 Conference on Designing Interactive Systems
Stats: Acceptance rate: 26%, h5-index: 41
Date: 2017
Co-Authors: İdil Bostan, Mert Canat, Mustafa Ozan Tezcan, Celalettin Yurdakul, Tilbe Göksun and Oğuzhan Özcan
Abstract
Gesture-elicitation studies are a common and important method for understanding user preferences. In these studies, researchers aim to extract gestures that users find desirable for different kinds of interfaces. During this process, researchers must manually analyze many videos, which is a tiring and time-consuming task. Although current video-analysis tools provide annotation capabilities and features such as automatic gesture analysis, researchers still need to (1) divide videos into meaningful segments, (2) manually examine each segment, (3) match collected user data with these segments, (4) code each video, and (5) verify their coding. These steps are burdensome, and current tools do not aim to make them easier or faster. To fill this gap, we developed "GestAnalytics," which offers simultaneous video monitoring, video tagging, and filtering. Our internal pilot tests show that GestAnalytics can be a beneficial tool for researchers who analyze video for gestural interfaces.
Role: Concept Creator, Developer, Main Author, Design Researcher
Type: Extended Abstract (Poster)
Conference: Proceedings of the 2017 ACM Conference Companion Publication on Designing Interactive Systems
Stats: h5-index: 41
Date: 2017
Co-Authors: Oğuzhan Özcan