Nazemi, Kawa; Burkhardt, Dirk
In: Bebis, George; Boyle, Richard; Parvin, Bahram; Koracin, Darko; Ushizima, Daniela; Chai, Sek; Sueda, Shinjiro; Lin, Xin; Lu, Aidong; Thalmann, Daniel; Wang, Chaoli; Xu, Panpan (Ed.): Advances in Visual Computing, pp. 283–294, Springer International Publishing, Cham, 2019, ISBN: 978-3-030-33723-0.
Tags: Artificial Intelligence, Data Analytics, Human Factors, Human-Centered Interfaces, Human-Computer Interaction, Information Visualization, Intelligent Systems, Machine Learning, Visual Analytics
Visual Analytics combines automated techniques with interactive visualizations and thereby offers extensive analysis capabilities for technology and innovation management. Not only the use of machine learning and data mining methods plays an important role here: thanks to its rich interaction capabilities, Visual Analytics provides a more user-centered approach in which users can manipulate the entire analysis process and extract the most valuable information. Existing Visual Analytics systems for trend analytics and technology and innovation management do not really exploit this unique feature and largely neglect the human in the analysis process. Outcomes from research in information search, information visualization, and technology management can lead to more sophisticated Visual Analytics systems that involve the human throughout the entire analysis process. In this paper, we propose a new interaction approach for Visual Analytics in technology and innovation management, with a special focus on technological trend analytics.
Nazemi, Kawa; Burkhardt, Dirk; Praetorius, Alexander; Breyer, Matthias; Kuijper, Arjan
In: Kurosu, Masaaki (Ed.): Human Centered Design, pp. 566–575, Springer, Berlin, Heidelberg, Germany, 2011, ISBN: 978-3-642-21753-1.
Information visualization occupies an important position today, as it is required in nearly every context where large databases have to be visualized. This challenge demands new approaches that give users adequate access to these data. Static visualizations merely display the data without supporting the user, which has motivated research on adaptive user interfaces and, in particular, adaptive visualizations. In such approaches, the visualization is adapted to the user's behavior, e.g., graphical primitives are changed to highlight entities that seem relevant to the user. This approach is widely used, but it is limited to changes within a single visualization. Modern heterogeneous data exhibit different kinds of aspects that current visualizations try to address, so a user often needs more than a single visualization for information retrieval. In this paper, we describe a concept for adapting the user interface by selecting visualizations based on automatically generated data characteristics: visualizations that fit these characteristics well are chosen. As a result, the user receives an automatically arranged set of visualizations as the starting point for interacting with the data.
Burkhardt, Dirk; Breyer, Matthias; Glaser, Christian; Nazemi, Kawa; Kuijper, Arjan
In: Stephanidis, Constantine (Ed.): Universal Access in Human-Computer Interaction. Design for All and eInclusion, pp. 20–29, Springer, Berlin, Heidelberg, Germany, 2011, ISBN: 978-3-642-21672-5.
Nowadays, a wide range of input devices is available to users of technical systems. Modern alternative interaction devices in particular, known from game consoles and similar platforms, provide a more natural way of interaction. Supporting them in computer programs, however, remains a major challenge, because developing an application that supports such alternative input devices requires considerable effort. We therefore designed a concept for an interaction system that supports the use of alternative interaction devices. Its central element is a server that provides applications with a simple access interface to such devices. It is also possible to address an abstract device by its properties, and the interaction system takes over the mapping to a concrete device. To realize this idea, we also defined a taxonomy that classifies interaction devices by their interaction method and by the required interaction results, such as recognized gestures. By integrating this interaction system, it generally becomes possible to develop user-centered systems, because an adequate integration of alternative interaction devices provides a more natural and easily understood form of interaction.
Nazemi, Kawa; Burkhardt, Dirk; Stab, Christian; Breyer, Matthias; Wichert, Reiner; Fellner, Dieter W.
In: Wichert, Reiner; Eberhardt, Birgid (Ed.): Ambient Assisted Living: 4. AAL-Kongress 2011, Berlin, Germany, January 25–26, 2011, Chapter 6, pp. 75–90, Springer, Berlin, Heidelberg, Germany, 2011, ISBN: 978-3-642-18167-2.
Modern interaction methods and devices enable a more natural and intuitive interaction. Currently, mobile phones and game consoles that support such gesture-based interactions sell particularly well, and such devices are bought not only by traditionally technically experienced consumers. Interaction with them has become so easy that older people also play and work with them. Older people in particular face handicaps: it is difficult for them to read small text, such as the button labels on television remote controls, and they quickly become overstrained, so complex technical systems are of little help to them. Gesture-based interaction can avoid these problems, but to allow intuitive and easy gesture interaction, the supported gestures must be easy to understand. In this paper, we therefore tried to identify intuitive gestures for common interaction scenarios on computer-based systems for use in ambient assisted environments. In our evaluation, users expressed their opinion on intuitive gestures for different presented scenarios and tasks. Based on these results, intuitively usable systems can be developed, allowing users to communicate with technical systems on a more intuitive level using accelerometer-based devices.