In this project, we are interested in creating an analog of the camera and microphone, but for capturing mechanical models that can be rendered on a haptic interface. Such a device, called the haptic probe, can be realized by a motorized, instrumented probe together with a method for characterizing the mechanical impedance (the feel) of both physical and virtual objects. The proposed device and identification method fill a significant need for automated modeling and verification in the haptic display of virtual and teleoperated environments.

The probe would explore objects much as a human does, pushing or moving them while recording the response motions or interaction forces. Essentially, the proposed haptic probe is a mechanization of human haptic exploration. The collected data will be used to construct models automatically: to determine parameter values and the hybrid model structure in a haptic rendering algorithm. Additionally, and to great advantage, the probe could be used to compare the impedance of the resulting virtual object against that of the corresponding physical object.
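Determining parameter values from recorded probe data can be illustrated with a minimal sketch. Assuming a simple mass-spring-damper object model f = m·x'' + b·x' + k·x (a hypothetical choice, not the project's actual model), the parameters can be fit to sampled motion and force trajectories by ordinary least squares:

```python
import numpy as np

# Illustrative sketch only: identify mass-spring-damper impedance parameters
# (m, b, k) from recorded motion x(t) and force f(t) via least squares.
# Function and signal names here are hypothetical.

def identify_impedance(t, x, f):
    """Fit f = m*x'' + b*x' + k*x to sampled trajectories."""
    dt = t[1] - t[0]
    v = np.gradient(x, dt)             # numerical velocity estimate
    a = np.gradient(v, dt)             # numerical acceleration estimate
    Phi = np.column_stack([a, v, x])   # regressor matrix, one row per sample
    theta, *_ = np.linalg.lstsq(Phi, f, rcond=None)
    return theta                       # [m, b, k]

# Synthetic "probe" data from a known object: m=0.5 kg, b=2.0, k=100.0
t = np.linspace(0, 1, 2000)
x = 0.01 * np.sin(2 * np.pi * 3 * t)   # imposed probe motion
dt = t[1] - t[0]
v = np.gradient(x, dt)
a = np.gradient(v, dt)
f = 0.5 * a + 2.0 * v + 100.0 * x      # resulting interaction force

m, b, k = identify_impedance(t, x, f)
```

With noise-free data the regression recovers the true parameters; in practice the numerical differentiation step would need filtering against sensor noise.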

We are especially interested in using data produced by the haptic probe to estimate parameter values for hybrid dynamical models, in particular models of objects that make and break contact with one another. We are currently developing a family of model reference adaptive parameter estimators for hybrid dynamical systems to achieve this goal.
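The flavor of such an estimator can be sketched for the simplest hybrid case: a unilateral spring that produces force only during contact. The gradient-type update rule and gain below are illustrative assumptions, not the project's actual algorithm; the point is that the estimate is adapted only while the contact mode is active.

```python
import numpy as np

# Hypothetical sketch: gradient-type adaptive estimation of contact stiffness
# for a hybrid model, f = k*x when x > 0 (contact), f = 0 otherwise.
# Gains and signals are illustrative.

def estimate_stiffness(x, f, gamma=0.5, k0=0.0):
    """Adapt a stiffness estimate k_hat only while contact is active."""
    k_hat = k0
    for xi, fi in zip(x, f):
        if xi > 0:                   # contact mode: the spring model applies
            e = k_hat * xi - fi      # output prediction error
            k_hat -= gamma * e * xi  # gradient (MIT-rule style) update
    return k_hat

# Synthetic data: true stiffness k = 50; probe oscillates through contact.
t = np.linspace(0, 10, 5000)
x = np.sin(2 * np.pi * t)            # penetration depth (contact when > 0)
f = 50.0 * np.maximum(x, 0.0)        # unilateral contact force

k_hat = estimate_stiffness(x, f)
```

During the no-contact phases the data carry no information about k, so freezing the update there prevents the estimate from drifting; richer hybrid models add analogous per-mode regressors and updates.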


Research Project Member(s)
  Patoglu, Volkan
  Gillespie, R. Brent
Related Project(s)
  The Haptic Probe
  Extremal Distance Maintenance
  On-line Symbolic Constraint Embedding
Research Project Papers
  The Haptic Probe
Project Sponsors
  National Science Foundation