Modern virtual reality technologies can generate realistic images of virtual scenes. The next step towards a higher level of realism is the intuitive manipulation of objects in a virtual scene, while haptic collision information is fed back to the human user.
The telepresence and virtual reality group of the German Aerospace Center develops haptic rendering algorithms that enable such real-time virtual reality manipulation scenarios. The user can manipulate complex virtual objects via our bimanual haptic interface HUG; whenever objects collide with each other, the user perceives the corresponding collision forces. A use case is given in the following car part assembly video.
In contrast to visual rendering, which requires update rates of only about 30 Hz for smooth visual feedback, haptic signals must be updated at a challenging rate of 1000 Hz to obtain stable and realistic haptic feedback. We use an algorithm based on two data structures: voxelized distance fields (voxmaps) and point-sphere hierarchies (pointshells). Our work is inspired by the haptic rendering approach introduced by the Voxmap-PointShell (VPS) algorithm, which allows for real-time collision feedback even with objects consisting of several million triangles.
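The core idea can be illustrated with a minimal sketch: one object is sampled into a regular grid of signed distances (the voxmap), the other into a set of surface points with normals (the pointshell); each penetrating point contributes a penalty force along its normal, scaled by the penetration depth. The function below is a simplified, hypothetical illustration of this principle, not the actual DLR or VPS implementation; the name `collision_force` and its parameters are assumptions for this example.

```python
import numpy as np

def collision_force(voxmap, voxel_size, origin, points, normals, stiffness=1000.0):
    """Simplified voxmap/pointshell force query (illustrative sketch only).

    voxmap     : 3D array of signed distances on a regular grid
                 (negative inside the static obstacle)
    voxel_size : edge length of one voxel
    origin     : world position of voxel (0, 0, 0)
    points     : (N, 3) pointshell sample positions of the moving object
    normals    : (N, 3) unit normals, here pointing out of the obstacle,
                 i.e. in the direction the point should be pushed
    """
    # Map each pointshell point to a voxel index in the distance field.
    idx = np.floor((points - origin) / voxel_size).astype(int)
    # Discard points that fall outside the voxmap bounds.
    inside = np.all((idx >= 0) & (idx < voxmap.shape), axis=1)
    idx, nrm = idx[inside], normals[inside]
    # Look up the signed distance for each remaining point.
    d = voxmap[idx[:, 0], idx[:, 1], idx[:, 2]]
    pen = d < 0.0  # penetrating points only
    # Penalty force: penetration depth times stiffness, along the normal.
    forces = stiffness * (-d[pen])[:, None] * nrm[pen]
    return forces.sum(axis=0)
```

Because the per-point work is a constant-time voxel lookup, the cost of one query grows only with the number of pointshell points, which is what makes a 1000 Hz update loop feasible for complex geometry; the sphere hierarchy of a real pointshell additionally lets whole branches be culled before any point is tested.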
By using virtual reality scenarios with such haptic rendering technologies it is possible, for instance,