Navigation on the Web with Haptic Feedback
Symposium:
- ISEA96: Seventh International Symposium on Electronic Art
Presentation Title:
- Navigation on the Web with Haptic Feedback
Presenter(s):
Abstract:
Poster Statement
Multimodal systems present a new challenge: adapting computer inputs and outputs to different media and modalities, depending on the user's tasks, skills, and preferences. Multimodal user interfaces provide users with multiple kinds of events, such as graphics, sounds (not only non-speech sounds but also synthesized speech), and haptic (or force) feedback. Haptic feedback is an emerging concept that consists of stimulating both the tactile and kinesthetic senses. A haptic device is like a graphical display, except that instead of creating visible information it creates touchable information. In addition, like a computer mouse, a haptic device is an input device: it allows a user to point to, select, or manipulate computerized objects. However, unlike visual (or acoustic) displays, there is no universal haptic device, so such devices must be designed and developed with both technical and psycho-physical constraints in mind. Haptic interfaces have numerous applications, such as virtual environments, artistic creation, tele-operated systems, computer access for disabled users, and micro-gravity environments.

Remote Multimodal Interaction

Haptic feedback technology offers a new performance-support-system paradigm. Since a haptic device is used as the pointing device, it allows a system to physically guide the user's hand: for instance, quickly pointing to small icons, selecting thin window frames, opening pop-up menus, and selecting the right items. Our system is composed of two (or more) workstations running Windows 95 linked via a network. A user is put into a work situation and uses a haptic device as the pointing device. A remote peer monitors the user via a second machine, which reproduces all or part of the first machine's objects and actions. Thus, the peer is able to analyze and estimate, in real time, how and when to take control of the haptic device and guide or suggest actions to the user. For example, during a training stage, the peer may notice that the user has difficulty with navigation tasks. The peer then performs these tasks with a pointing device (e.g., a mouse). Meanwhile, the remote haptic device moves according to the peer's mouse motions, giving the user physical guidance. Since the peer also has a force-feedback device, the remote haptic interaction is bilateral.
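As a concrete illustration of the guidance channel described above, the following Python fragment sketches one possible way the peer's mouse motions could be streamed to the user's machine and turned into guiding forces on the haptic device. The poster does not specify the protocol or the device API, so the message format, the port number, and the `get_position()`/`apply_force()` methods on the device object are assumptions introduced purely for illustration; the original system ran on Windows 95 with its own device drivers.

```python
import json
import socket
import struct

# Hypothetical sketch of the peer-to-user guidance channel. The message
# format, port number, and device API below are assumptions, not the
# system described in the poster.

GUIDANCE_PORT = 5007  # arbitrary port chosen for this sketch


def send_peer_motion(sock, x, y, take_control):
    """Peer side: transmit one mouse-motion sample to the user's machine."""
    msg = json.dumps({"x": x, "y": y, "guide": take_control}).encode()
    sock.sendall(struct.pack("!I", len(msg)) + msg)


def recv_peer_motion(sock):
    """User side: read one length-prefixed guidance message, or None on EOF."""
    header = sock.recv(4)
    if len(header) < 4:
        return None
    (length,) = struct.unpack("!I", header)
    payload = b""
    while len(payload) < length:
        chunk = sock.recv(length - len(payload))
        if not chunk:
            return None
        payload += chunk
    return json.loads(payload)


def guidance_loop(sock, haptic_device):
    """User side: turn incoming peer motions into guiding forces.

    `haptic_device` is a stand-in object assumed to expose `get_position()`
    and `apply_force(fx, fy)`; the actual device interface is not given in
    the poster.
    """
    stiffness = 0.4  # spring gain pulling the user's hand toward the peer's pointer
    while True:
        msg = recv_peer_motion(sock)
        if msg is None:
            break
        if msg["guide"]:
            ux, uy = haptic_device.get_position()
            # Spring-like force toward the position the peer is pointing at.
            haptic_device.apply_force(stiffness * (msg["x"] - ux),
                                      stiffness * (msg["y"] - uy))
        else:
            haptic_device.apply_force(0.0, 0.0)
```

Because the peer's workstation also drives a force-feedback device, the same channel could be run in the opposite direction, which is one way the bilateral interaction mentioned above could be realized.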