Bilkent University
Department of Computer Engineering
MS Thesis Presentation
Virtual Sculpting with Advanced Gestural Interface
Nurettin Çağrı Kılıboz
MS Student
Computer Engineering Department
Bilkent University
We propose a virtual reality application that can be used to design preliminary/conceptual models in a manner similar to real-world clay sculpting. The proposed system makes use of an innovative gestural interface that enhances the human-computer interaction experience. The gestural interface employs advanced motion-capture hardware, namely data gloves and six-degree-of-freedom motion trackers, instead of classical input devices such as the keyboard and mouse. The design process takes place in a virtual environment that contains a volumetric deformable model, design tools, and a virtual hand driven by the data glove and the tracker through a sophisticated approach. Users manipulate the design tools and the deformable model via the virtual hand. The model is deformed by stuffing material (voxels) into it or carving material out of it, either with the help of the tools or directly with the virtual hand. The virtual sculpting system also includes a volumetric force-feedback indicator that provides visual aid. We also offer a mouse-like interaction approach in which users can still interact with conventional graphical user interface items, such as buttons, using the data glove and tracker. Users can also command the application with gestural commands thanks to our real-time, trajectory-based dynamic gesture recognition algorithm. The gesture recognition technique exploits a fast learning mechanism that does not require extensive training data to teach gestures to the system. We represent gestures as an ordered sequence of directional movements in 2D and use a six-degree-of-freedom position tracker to collect trajectory data. In the learning phase, sample gesture data is filtered and processed to create gesture recognizers, which are essentially finite-state machine sequence recognizers. These recognizers achieve online gesture recognition without requiring gesture start and end points to be specified.
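The recognizer described above — gestures as ordered sequences of quantized 2D directions, matched online by finite-state machines — can be sketched roughly as follows. This is an illustrative reconstruction under stated assumptions, not the thesis's implementation: the 8-direction quantization, the repeat-tolerant state transitions, and all names (GestureFSM, quantize, recognize) are hypothetical.

```python
import math

DIRECTIONS = 8  # assumption: quantize motion into 8 compass directions (0 = east, CCW)

def quantize(dx, dy):
    """Map a 2D displacement to one of 8 directional symbols."""
    angle = math.atan2(dy, dx) % (2 * math.pi)
    return int((angle + math.pi / DIRECTIONS) / (2 * math.pi / DIRECTIONS)) % DIRECTIONS

class GestureFSM:
    """Finite-state sequence recognizer for a single gesture template."""
    def __init__(self, name, template):
        self.name = name          # gesture label, e.g. "L"
        self.template = template  # ordered direction symbols, e.g. [6, 0] = down, right
        self.state = 0            # index of the next expected symbol

    def feed(self, symbol):
        """Consume one directional symbol from the stream; True on a full match."""
        if symbol == self.template[self.state]:
            self.state += 1                      # advance on the expected direction
        elif self.state > 0 and symbol == self.template[self.state - 1]:
            pass                                 # tolerate a repeated direction (slow motion)
        else:
            # mismatch: restart, possibly re-entering on the first symbol
            self.state = 1 if symbol == self.template[0] else 0
        if self.state == len(self.template):
            self.state = 0                       # accept, then reset for reuse
            return True
        return False

def recognize(stream, fsms):
    """Online recognition over a raw position stream; no explicit start/end marks."""
    hits = []
    px, py = stream[0]
    for x, y in stream[1:]:
        sym = quantize(x - px, y - py)
        px, py = x, y
        for fsm in fsms:
            if fsm.feed(sym):
                hits.append(fsm.name)
    return hits
```

Because every FSM consumes the same symbol stream and simply resets on mismatch, recognition runs continuously on tracker motion, which mirrors the online, segmentation-free behavior the abstract claims.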
The results of the conducted user study show that the proposed method is very promising in terms of gesture recognition performance in a continuous stream of motion (73% accuracy) and user attitude assessment. The novel aspect of the proposed approach is that it gives users the freedom to create gesture commands according to their own preferences for selected tasks. Thus, the complete interface makes the HCI process more intuitive, natural, and user-specific.
DATE: 26 August 2013, Monday @ 10:00
PLACE: EA-502