The use of haptic interfaces in professional applications is quite recent. For example, PHANToM support in CAD software only appeared in the late 1990s. As a result, the first haptic libraries were long restricted to the APIs of haptic devices. These APIs mostly focus on ensuring a dialogue between the application and a given interface, but do not manage the environment. The GHOST API is the most widely known and the most complete of them all. It is based on a geometrical definition of the environment; its base version offered interaction (through a virtual proxy) between a purely point-based avatar and the triangles of the environment. It was possible to define haptic texture or “force shading” effects. Since then, this API has been extended to manage rigid-body avatars. The environments managed by GHOST remain too simple, however, and the animation of the environment is essentially left to the programmer. Commercial offerings like the Reachin API by Reachin Technologies or e-Touch by Novint Technologies provide modular graphic/haptic libraries that can potentially be adapted to several interfaces.
Several physical simulation engines have been available on the market for some years. Directly connecting the API of a haptic interface to this type of library is an attractive idea. In practice, however, the connection often proves difficult because of their very different timing constraints. Since the beginning of the 21st century, it has therefore seemed important to design the simulation program and the haptic loop together. Projects like i-Touch (Pocheville & Kheddar, 2004) or SPORE (Meseure et al., 2004) combine mechanical simulation with the control of haptic devices, irrespective of the haptic interface used. The i-Touch project mainly manages rigid bodies and targets CAD-type applications; the SPORE project manages deformable bodies and is meant for surgical simulations.
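The timing mismatch mentioned above is commonly bridged with an intermediate representation: the slow physics loop periodically publishes a simple local model of the contact, which the fast haptic loop can evaluate at every tick. The following single-threaded Python sketch illustrates the idea in one dimension; the class names, rates, and parameters are illustrative assumptions, not taken from GHOST, i-Touch, or SPORE.

```python
# Sketch of multirate coupling via an intermediate representation:
# the physics loop (slow) hands the haptic loop (fast) a local contact
# model cheap enough to evaluate at every haptic tick.
# All names and rates here are illustrative assumptions.

PHYSICS_HZ = 60     # typical graphics/physics update rate
HAPTIC_HZ = 1000    # typical haptic rate needed for stable force rendering

class ContactPlane:
    """Local model published by the physics loop: a 1-D contact plane."""
    def __init__(self, height, stiffness):
        self.height = height        # plane position along the probe axis (m)
        self.stiffness = stiffness  # contact stiffness (N/m)

    def force(self, device_pos):
        depth = self.height - device_pos   # penetration below the plane
        return self.stiffness * depth if depth > 0.0 else 0.0

def haptic_substeps(model, positions):
    """Evaluate the local model once per haptic tick between physics updates."""
    return [model.force(p) for p in positions]

# One physics period contains HAPTIC_HZ // PHYSICS_HZ haptic ticks.
ticks = HAPTIC_HZ // PHYSICS_HZ
model = ContactPlane(height=0.0, stiffness=1000.0)
forces = haptic_substeps(model, [-0.001] * ticks)  # device held 1 mm below plane
```

Between two physics updates, the haptic loop never touches the full simulation; it only queries the published plane, which keeps the 1 kHz loop cheap and stable even if the physics step is slow.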
In this chapter we saw how to adapt a haptic interface to a virtual environment that is often, but not necessarily, based on physics. We mainly focused on force rendering and briefly introduced tactile rendering, which is currently the subject of numerous research works. We also showed how to compute haptic rendering and how to adapt the computation to frequency constraints, an essential step for obtaining a stable rendering. The approaches used often differ depending on whether the device operates in impedance or admittance mode. However, the god-object/virtual proxy method makes it possible to unify the approaches by presenting an admittance-type behaviour regardless of the underlying interface.
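The god-object/virtual proxy idea for an impedance-type device can be sketched as follows: the proxy is held on the constraint surface while the device penetrates it, and the rendered force is a spring pulling the device toward the proxy. This simplified sketch uses a single plane through the origin; real implementations constrain the proxy against all contacting triangles of the mesh, and all names here are illustrative.

```python
import numpy as np

# God-object / virtual proxy sketch for a single constraint plane through
# the origin with normal n. A real implementation resolves the proxy
# against every contacting triangle; names are illustrative.

def proxy_step(device_pos, n, k):
    """Return (proxy_pos, force) for an impedance-type device."""
    device_pos = np.asarray(device_pos, dtype=float)
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)
    d = float(np.dot(device_pos, n))   # signed distance to the plane
    if d >= 0.0:
        proxy = device_pos.copy()      # free space: proxy tracks the device
    else:
        proxy = device_pos - d * n     # contact: proxy projected onto the surface
    force = k * (proxy - device_pos)   # spring coupling, rendered to the device
    return proxy, force
```

With the device 1 mm below the plane and k = 1000 N/m, the proxy stays on the surface and the force pushes the device back out along the normal; in free space the proxy coincides with the device and the force is zero.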
A significant number of studies have focused on 3DOF interfaces, where the interface moves only in translation and the avatar is reduced to a point mass.
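One such point-based effect, force shading, is the haptic analogue of Phong shading: instead of the facet normal, the force direction follows a normal interpolated from the triangle's vertex normals at the contact point, so a coarse mesh feels smooth. A minimal sketch, where the barycentric weights `b` and all names are illustrative assumptions:

```python
import numpy as np

# Force-shading sketch: the rendered force of magnitude k * penetration
# points along a normal interpolated from the vertex normals, just as
# Phong shading interpolates normals for lighting. Names are illustrative.

def shaded_force(vertex_normals, b, penetration, k):
    """Contact force along the barycentrically interpolated normal."""
    n = sum(w * np.asarray(vn, dtype=float)
            for w, vn in zip(b, vertex_normals))
    n = n / np.linalg.norm(n)          # renormalise the blended normal
    return k * penetration * n
```

Because only the force direction changes, the technique adds no cost to collision detection; the same contact point simply feels rounded rather than faceted.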
This type of interaction already has several applications. Special effects like “force shading” or haptic texture were mainly studied within this scope. However, shifting to a rigid-body avatar is essential for modelling interaction tools that are closer to real-world tools. Current studies are thus trying to restore haptic sensations as realistically as possible within the scope of complex interactions. New models are suggested in order to meet both graphic and haptic requirements. Some studies even focus on a single application (for example, sculpture or painting) to propose specific but highly