Haptic Rendering: Collision Detection and Response

Virtual reality technologies can now generate realistic images of virtual scenes. The next step towards greater realism is the intuitive manipulation of objects in the scene, with haptic collision information fed back to the human user.

The telepresence and virtual reality group of the German Aerospace Center (DLR) develops haptic rendering algorithms that enable such realtime manipulation scenarios. The user manipulates complex virtual objects via our bimanual haptic interface HUG; whenever objects collide with each other, the user perceives the corresponding collision forces. The following car part assembly video shows a use case.

A Platform for Bimanual Virtual Assembly Training with Haptic Feedback in Large Multi-Object Environments
This video accompanies the paper of the same title presented at the ACM Symposium on Virtual Reality Software and Technology (VRST) 2016. We present a virtual reality platform which addresses and integrates some of the currently challenging research topics in the field of virtual assembly: realistic and practical scenarios with several complex geometries, bimanual six-DoF haptic interaction for hands and arms, and intuitive navigation in large workspaces. We put a special focus on our collision computation framework, which is able to display stiff and stable forces at 1 kHz using a combination of penalty- and constraint-based haptic rendering methods. The realtime simulation supports interaction with multiple arbitrary geometries as well as several interfaces, allowing for collaborative training experiences. Performance results for an exemplary car assembly sequence demonstrate the readiness of the system.
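As a rough illustration of how constraint-based rendering complements penalty forces, the sketch below implements a simple virtual coupling: a spring-damper that ties the haptic device to a simulated proxy object, so the user feels the smooth coupling force rather than raw penalty forces. This is a generic textbook construction, not the paper's implementation; all identifiers and parameter values (stiffness, damping, mass) are illustrative assumptions.

```cpp
#include <array>
#include <cstdio>

using Vec3 = std::array<double, 3>;

// Virtual coupling: a spring-damper between the device pose and a simulated
// proxy object. Collision (penalty) forces push the proxy; the user is
// rendered the coupling force, which stays smooth and stable even when the
// penalty forces are noisy. Stiffness and damping values are illustrative.
struct VirtualCoupling {
    double k = 3000.0;  // coupling stiffness [N/m] (assumed)
    double b = 5.0;     // coupling damping [N*s/m] (assumed)

    Vec3 couplingForce(const Vec3& device, const Vec3& proxy,
                       const Vec3& deviceVel, const Vec3& proxyVel) const {
        Vec3 f;
        for (int i = 0; i < 3; ++i)
            f[i] = k * (device[i] - proxy[i]) + b * (deviceVel[i] - proxyVel[i]);
        return f;  // force on the proxy; the user feels the reaction
    }
};

int main() {
    const double dt = 0.001;  // one step of the 1 kHz haptic loop
    const double mass = 0.5;  // proxy mass [kg] (assumed)
    VirtualCoupling vc;
    Vec3 device{0.01, 0.0, 0.0};  // device has moved 1 cm past the contact
    Vec3 proxy{0.0, 0.0, 0.0};    // proxy is blocked at the surface
    Vec3 vel{0.0, 0.0, 0.0};
    Vec3 zero{0.0, 0.0, 0.0};

    Vec3 f = vc.couplingForce(device, proxy, zero, vel);
    // Explicit Euler step of the proxy; in a full system the penalty force
    // from collision detection would be added to f here.
    for (int i = 0; i < 3; ++i) {
        vel[i] += dt * f[i] / mass;
        proxy[i] += dt * vel[i];
    }
    std::printf("coupling force: %.1f N; the user feels the reaction\n", f[0]);
    return 0;
}
```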

In contrast to visual rendering, which requires update rates of only about 30 Hz for smooth visual feedback, haptic signals must be updated at a challenging rate of 1000 Hz to obtain stable and realistic haptic feedback. We use an algorithm based on two data structures: voxelized distance fields (voxmaps) and point-sphere hierarchies (pointshells). Our work is inspired by the haptic rendering approach introduced by the Voxmap-PointShell (VPS) algorithm, which allows for realtime collision feedback even with objects consisting of several million triangles.
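To make the voxmap-pointshell pairing concrete, here is a minimal penalty-query sketch under several simplifying assumptions: the pointshell is a flat list rather than a sphere hierarchy, point positions are already transformed into the voxmap frame, and the distance lookup is nearest-voxel instead of interpolated. The types and the stiffness value are our own illustration, not DLR's implementation.

```cpp
#include <array>
#include <cstdio>
#include <vector>

using Vec3 = std::array<double, 3>;

// Voxmap: a regular grid over the static object storing, per voxel, the
// signed distance to the surface (negative inside the object).
struct Voxmap {
    int nx, ny, nz;
    double voxelSize;
    Vec3 origin;
    std::vector<float> dist;  // nx*ny*nz values, x-fastest layout

    // Nearest-voxel lookup; a production version would interpolate.
    float distanceAt(const Vec3& p) const {
        int i = static_cast<int>((p[0] - origin[0]) / voxelSize);
        int j = static_cast<int>((p[1] - origin[1]) / voxelSize);
        int k = static_cast<int>((p[2] - origin[2]) / voxelSize);
        if (i < 0 || i >= nx || j < 0 || j >= ny || k < 0 || k >= nz)
            return 1e9f;  // outside the grid: treated as free space
        return dist[(k * ny + j) * nx + i];
    }
};

// Pointshell sample: a surface point of the moving object with its
// inward-pointing unit normal, along which the penalty force acts.
struct ShellPoint {
    Vec3 position;  // assumed already transformed into the voxmap frame
    Vec3 normal;
};

// Penalty response: one spring force per penetrating point. The real VPS
// algorithm walks a sphere hierarchy over the points so that entire
// non-colliding subtrees are culled; this flat loop is only illustrative.
Vec3 penaltyForce(const Voxmap& map, const std::vector<ShellPoint>& shell,
                  double stiffness) {
    Vec3 total{0.0, 0.0, 0.0};
    for (const auto& pt : shell) {
        float d = map.distanceAt(pt.position);
        if (d < 0.0f)  // penetrating: push along the inward normal
            for (int a = 0; a < 3; ++a)
                total[a] += stiffness * static_cast<double>(-d) * pt.normal[a];
    }
    return total;
}

int main() {
    // Toy voxmap: a 4x4x4 grid with 1 cm voxels, every voxel 5 mm "inside".
    Voxmap map{4, 4, 4, 0.01, {0.0, 0.0, 0.0},
               std::vector<float>(64, -0.005f)};
    ShellPoint p{{0.02, 0.02, 0.02}, {0.0, 0.0, 1.0}};
    std::vector<ShellPoint> shell{p};
    Vec3 f = penaltyForce(map, shell, 2000.0);  // stiffness is illustrative
    std::printf("penalty force: (%.2f, %.2f, %.2f) N\n", f[0], f[1], f[2]);
    return 0;
}
```

In VPS-style rendering, each penetrating point contributes a small spring force along its inward normal, which keeps the net force direction stable for deep, multi-point contacts; the sphere hierarchy exists to cull points that cannot be in contact, which is what makes the 1 kHz budget reachable for models with millions of triangles.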


Virtual reality scenarios equipped with such haptic rendering technologies make it possible, for instance,

  • to check in early stages of a product design whether different parts can be assembled optimally,
  • to integrate the knowledge of manufacturers that build the final product into the product engineering steps,
  • and to train mechanics, preparing them for complex assembly tasks with fragile objects.

Related Links

Student Project Offers

  • Satellite On-Orbit Servicing in Virtual Reality with Haptic Feedback
  • Improvements of Data Structures for Collision Detection and Force Computation
  • Haptic Rendering for Virtual Reality: Evaluation and Improvements

Selected Publications