GestUI: A Model-driven Method and Tool for Including Gesture-based Interaction in User Interfaces
Complex Systems Informatics and Modeling Quarterly 2016
Otto Parra, Sergio Espana, Oscar Pastor

Among the technological advances in touch-based devices, gesture-based interaction has become a prevalent feature in many application domains, and information systems are starting to explore this type of interaction. At present, gesture specifications are hard-coded by developers at the source-code level, which hinders their reusability and portability. Similarly, defining new gestures that reflect user requirements is a complex process. This paper describes a model-driven approach to including gesture-based interaction in desktop information systems. It incorporates a tool prototype that captures user-sketched multi-stroke gestures, transforms them into a model, and automatically generates both the gesture catalogue for gesture-based interaction technologies and the source code of gesture-based user interfaces. We demonstrate our approach in several applications, ranging from CASE tools to form-based information systems.


Keywords
Model-driven architecture; gesture-based interaction; multi-stroke gestures; information systems; gesture-based user interface
DOI
10.7250/csimq.2016-6.05
Hyperlink
https://csimq-journals.rtu.lv/article/view/csimq.2016-6.05

Parra, O., Espana, S., Pastor, O. GestUI: A Model-driven Method and Tool for Including Gesture-based Interaction in User Interfaces. Complex Systems Informatics and Modeling Quarterly, 2016, No. 6, pp. 73–92. e-ISSN 2255-9922. Available from: doi:10.7250/csimq.2016-6.05

Publication language
English (en)