On-Skin Interaction
Hands as a Controller: User Preferences for Hand Specific On-Skin Gestures

Abstract

Hand-specific on-skin (HSoS) gestures are a trending interaction modality, yet there is a gap in the field regarding users' preferences for these gestures. Thus, we conducted a user-elicitation study collecting 957 gestures from 19 participants for 26 commands. Results indicate that (1) users use one hand as a reference object, (2) assign different meanings to different parts of the hand, (3) give more importance to hand properties than to skin properties, and (4) treat hands as self-interfaces. Moreover, according to users' subjective evaluations, (5) exclusive gestures are less tiring than intuitive ones. We report users' subjective evaluations and present a 33-element taxonomy to categorize the gestures. Furthermore, we present two user-defined gesture sets: the intuitive set, which includes users' first choices and natural-feeling gestures, and the exclusive set, which includes more creative gestures unique to this modality. Our findings can inspire and guide designers and developers of HSoS.

 

Role: Concept Creator, Main Co-Author, Design Researcher

Type: Full Paper

Conference: ACM Conference on Designing Interactive Systems [acceptance rate: 24%, h5-index: 31]

Date: 2017

Co-Authors: İdil Bostan, Mert Canat, Mustafa Ozan Tezcan, Celalettin Yurdakul, Tilbe Göksun, Oğuzhan Özcan (Advisor)

 

PDF Slides Cite ACM
GestAnalytics: Experiment and Analysis Tool for Gesture-Elicitation Studies

Abstract

Gesture-elicitation studies are common and important for understanding user preferences. In these studies, researchers aim to extract gestures that users find desirable for different kinds of interfaces. During this process, researchers have to manually analyze many videos, which is a tiring and time-consuming task. Although current video-analysis tools offer annotation and features like automatic gesture analysis, researchers still need to (1) divide videos into meaningful pieces, (2) manually examine each piece, (3) match collected user data with these pieces, (4) code each video, and (5) verify their coding. These steps are burdensome, and current tools do not aim to make the process easier or faster. To fill this gap, we developed "GestAnalytics", which offers simultaneous video monitoring, video tagging, and filtering. Our internal pilot tests show that GestAnalytics can be a beneficial tool for researchers who analyze videos for gestural interfaces.

 

Role: Concept Creator, Developer, Main Author, Design Researcher

Type: Full Paper

Conference: ACM Conference on Designing Interactive Systems Companion [h5-index: 31]

Date: 2017

Co-Authors: Oğuzhan Özcan (Advisor)

 

PDF Cite ACM
It Made More Sense: Comparison of User-elicited On-Skin Touch and Freehand Gesture Sets

Abstract

Research on gestural control interfaces is becoming more widespread in the pursuit of natural interfaces. Two popular gesture types are freehand and on-skin touch gestures, because they eliminate the use of an intermediary device. Previous studies investigated these modalities separately with user-elicitation methods; however, there is a gap in the field regarding their comparison. In this study, we compare user-elicited on-skin touch and freehand gesture sets to explore users' preferences. We conducted an experiment comparing 13 gestures for controlling computer tasks in each set. Eighteen young adults participated in our study and completed a survey consisting of the NASA Task Load Index and four additional items on social acceptability, learnability, memorability, and goodness. The results show that on-skin touch gestures were less physically demanding and more socially acceptable than freehand gestures. On the other hand, freehand gestures were more intuitive than on-skin touch gestures. Overall, our results suggest that different gesture types could be useful in different scenarios. Our contribution might inspire designers and developers to make better judgments when designing new gestural interfaces for a variety of devices.

 

Role: Concept Creator, Supervisor

Type: Full Paper

Conference: Distributed, Ambient and Pervasive Interactions: 5th International Conference, DAPI 2017, Held as Part of HCI International 2017 [h5-index: 20]

Date: 2017

Co-Authors: Hayati Havlucu, M. Yarkın Ergin, İdil Bostan, Tilbe Göksun, Oğuzhan Özcan (Advisor)

 

PDF Cite Google Books