Modern technologies and techniques are constantly opening up new possibilities for visualizing information on the Web. Graphical visualizations, associated with buzzwords such as Rich Internet Applications (RIA), play an increasingly important role. Parallel to this change on the Web, a growing use of alternative interaction devices can be observed. These devices provide a more intuitive interaction metaphor by building on natural human behavior, such as gesture recognition. Nintendo's introduction of the WiiMote, a low-cost, gesture-based controller for its game console, has fueled this trend. Although these two technologies — the graphical visualization of information on the Web on the one hand and the use of alternative interaction devices on the other — have each undergone rapid development, there is as yet no adequate connection between them. The aim of the present work is therefore to develop a method for connecting gesture-based interaction devices, in particular the WiiMote, with web-based graphical visualizations.
In the course of this diploma thesis, WiiGesture was developed as a prototypical implementation for using alternative interaction devices — exemplified by the WiiMote — for gesture interaction in web applications. To simplify gesture training, a learning tool was created that stores the autonomously generated gestures in a configuration file. With the help of the additionally developed API, any web application can now be extended with gesture interaction. This API takes over all tasks, from reading in the configuration file to identifying executed gestures; events are used to communicate the results to the application. With the learning tool, a user can also later train their own intuitive gestures and use them in an application.
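The event-based flow described above — loading trained gestures from a configuration file, identifying an executed gesture, and reporting the result to the application as an event — can be illustrated with a minimal sketch. All names (`GestureRecognizer`, `loadConfiguration`, `onGesture`, `recognize`) are hypothetical and not taken from the actual WiiGesture API; the exact matching of gesture patterns also stands in for whatever recognition algorithm the thesis implements.

```typescript
// Illustrative sketch of an event-based gesture API; names are assumptions,
// not the real WiiGesture interface.

type GestureHandler = (name: string) => void;

// A trained gesture as it might appear in the configuration file:
// a name plus a reference sequence of quantized sensor values.
interface GestureDefinition {
  name: string;
  pattern: number[];
}

class GestureRecognizer {
  private gestures: GestureDefinition[] = [];
  private handlers: GestureHandler[] = [];

  // Stands in for "reading in the configuration file".
  loadConfiguration(config: GestureDefinition[]): void {
    this.gestures = config;
  }

  // Applications subscribe to gesture events.
  onGesture(handler: GestureHandler): void {
    this.handlers.push(handler);
  }

  // Identify an executed gesture by exact pattern match (a real
  // recognizer would use a tolerant matcher) and communicate the
  // result to the application via the registered event handlers.
  recognize(input: number[]): string | null {
    for (const g of this.gestures) {
      if (
        g.pattern.length === input.length &&
        g.pattern.every((v, i) => v === input[i])
      ) {
        this.handlers.forEach((h) => h(g.name));
        return g.name;
      }
    }
    return null;
  }
}
```

A web application would then register a handler once and react to recognized gestures as they occur, without polling the device itself.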
This work serves as the basis for an extended interaction system that can be generically extended with additional gesture-based interaction devices. Beyond gesture recognition, further interaction methods should also be possible.