Research Areas

Haptic Rendering applied to Virtual Reality

In a few words

Nowadays, virtual reality technologies are able to generate realistic images of virtual scenes that can hardly be distinguished from photographs. The next step towards increasing the level of realism is to make it possible to intuitively manipulate objects in a virtual scene while feeding haptic information back to the human user.

The research project on haptic rendering focuses on this topic. A human user moves the bimanual haptic interface (HMI), which is coupled to virtual objects, while a haptic rendering algorithm detects collisions between the virtual objects and computes the appropriate collision forces. These forces are displayed to the user via the haptic interface.


In parallel, the positions of the virtual objects are used to render an image of the virtual environment, which is displayed to the user through a head-mounted display or a 3D projection. Combining visual and haptic feedback makes it possible for humans to immerse deeply into virtual reality simulations.

Haptic Rendering: the VPS Algorithm

The force computation in virtual reality simulations with haptic feedback is performed by a haptic rendering algorithm. In comparison to visual rendering, which requires update rates of at least 30 Hz for smooth visual feedback, haptic signals must be updated at 1000 Hz (1 kHz) to obtain stable and realistic haptic feedback. The Voxmap-PointShell (VPS) algorithm achieves such high update rates when computing collision forces and torques of virtual objects. Even complex virtual environments consisting of several million triangles can be handled by the VPS algorithm.

The force computation of the VPS algorithm is based on two data structures, which are generated from polygonal models in a preprocessing step:

  • Voxmaps: voxelized volume structures that represent the static objects of the scene.
  • Pointshells: clouds of points that represent the dynamic (moving) objects; each point lies on the surface of the polygonal model and carries a normal vector pointing into the object.
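As a rough illustration, the two data structures might be represented and built as follows. This is a simplified sketch: the grid classification scheme, function names, and voxel states are assumptions for illustration, not the actual VPS preprocessing.

```python
import numpy as np

# Illustrative voxel states (assumed, not DLR's encoding).
FREE, SURFACE, INTERIOR = 0, 1, 2

def build_voxmap(occupied_cells, grid_shape):
    """Mark occupied cells of the static object, then classify each one
    as SURFACE (touching free space) or INTERIOR.

    occupied_cells: iterable of (i, j, k) indices covered by the object.
    """
    vox = np.full(grid_shape, FREE, dtype=np.uint8)
    for c in occupied_cells:
        vox[c] = INTERIOR
    # A cell with at least one FREE 6-neighbour (or on the grid border)
    # is a surface voxel.
    for (i, j, k) in occupied_cells:
        for di, dj, dk in [(1,0,0), (-1,0,0), (0,1,0),
                           (0,-1,0), (0,0,1), (0,0,-1)]:
            n = (i + di, j + dj, k + dk)
            outside = not all(0 <= n[a] < grid_shape[a] for a in range(3))
            if outside or vox[n] == FREE:
                vox[i, j, k] = SURFACE
                break
    return vox

def build_pointshell(surface_points, inward_normals):
    """A pointshell is simply surface points paired with inward normals."""
    return list(zip(np.asarray(surface_points, float),
                    np.asarray(inward_normals, float)))
```

A solid 3x3x3 block embedded in a 5x5x5 grid, for example, ends up with SURFACE cells on its faces and a single INTERIOR cell at its centre.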


Whenever a point lies inside a surface voxel, a single collision is detected, and the corresponding single collision force is computed immediately from the point's normal vector and its penetration depth. Finally, summing all single collision forces yields the total repulsion force.
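The per-point force rule described above can be sketched as follows. The lookup and penetration functions, the stiffness parameter, and all names are illustrative assumptions, not DLR's implementation:

```python
import numpy as np

def vps_force(pointshell, in_surface_voxel, penetration_depth, stiffness=1.0):
    """Sum single-point collision forces into one total repulsion force.

    pointshell: iterable of (point, inward_normal) pairs.
    in_surface_voxel(p): True if point p lies inside a surface voxel.
    penetration_depth(p): scalar penetration of p into the object.
    """
    total = np.zeros(3)
    for point, n_in in pointshell:
        if in_surface_voxel(point):
            # The force pushes opposite to the inward normal (out of the
            # object), scaled by stiffness and penetration depth.
            total += -stiffness * penetration_depth(point) * np.asarray(n_in)
    return total
```

Because each point contributes independently, the summation parallelizes naturally, which is one reason pointshell-based rendering can sustain kHz update rates.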

DLR is improving this haptic algorithm both in the computation of collision forces and in the generation of the haptic data structures. Furthermore, it has been extended to handle two dynamic objects, taking advantage of the bimanual HMI.

Control strategies

To ensure safe interaction and realistic feedback with our bimanual haptic device, stability must be guaranteed. Most theoretical approaches presented in the past for ensuring the stability of haptic interfaces are passivity-based. Although enforcing passivity of the haptic device is a valid approach for telemanipulation, it has two disadvantages for haptic rendering: it is conservative in terms of stability, and it requires the presence of mechanical damping.
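To illustrate why passivity-based conditions are conservative and demand physical damping, consider the classical sampled-data passivity condition for a virtual wall (Colgate and Schaffer): the device's mechanical damping b must satisfy b > KT/2 + |B|, where K and B are the virtual stiffness and damping and T is the sampling period. The numerical values below are illustrative only:

```python
def max_passive_stiffness(b, B, T):
    """Largest virtual stiffness K still satisfying the passivity
    condition b > K*T/2 + |B| (Colgate & Schaffer virtual wall)."""
    return 2.0 * (b - abs(B)) / T

# Illustrative (assumed) parameters:
b = 1.0    # physical device damping [N*s/m]
B = 0.0    # virtual damping [N*s/m]
T = 1e-3   # 1 kHz haptic update rate -> 1 ms sampling period

K_max = max_passive_stiffness(b, B, T)
print(K_max)  # 2000.0 N/m: any stiffer virtual wall violates passivity
```

Note that with b = 0 no positive stiffness is passive at all, which is exactly the dependence on mechanical damping criticized above; a direct stability analysis can admit a larger parameter set.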

The problem has been tackled by performing a detailed stability analysis of the system that takes into account parameters of the haptic device, of the human operator, and of the virtual world, including time delay.

This analysis yields the set of stable parameters, shown as stable regions in the adjacent figure. The passive regions, defined by the aforementioned conservative passivity condition, are subsets of the stable regions. On the basis of these stable regions, an easy-to-use linear stability condition has been derived.


Applications

Virtual reality setups enriched with haptic rendering technologies enable appealing applications for the automotive and aerospace industries. By means of these setups it is possible

  • to integrate the knowledge of the manufacturers who build the final product into the product engineering steps,
  • to check in early stages of the product design whether the models are correct in terms of assemblability,
  • to train mechanics in order to prepare them for future complex assembly tasks with fragile objects.

With DLR's haptic technologies, all this can be performed without expensive physical mock-ups or prototypes.


Above, an assembly verification example performed with our bimanual haptic interface. A coolant tank is assembled into a VW Touran model using the left hand, while the right hand handles an electric drill to fix the tank in place. Three pairs of objects must be checked for collision (i, ii, iii).

A video of the application is available here.
