Questions about the transformation relationship between the Touch device and the visual tool

I added a probe tool model (.stl) to the scene and coupled it with the Touch device, but how do I set the transformation relationship between the coordinate system of the Touch device and the probe tool? I have the following questions:

  1. What are the rules for establishing the coordinate system of the tool itself in iMSTK? (I created the tool model with SOLIDWORKS.)
  2. What are the rules for establishing the coordinate system of the Touch device itself?
  3. How do I set the transformation relationship between the Touch device and the tool coordinate system? Which sample programs describe the setting method?

The tool modeled in your CAD program has its own local coordinate system. You can transform it there, or transform the geometry before giving it to iMSTK via:

Geometry::transform(<transform here>, Geometry::TransformType::ApplyToData);
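
For example, here is a minimal sketch of loading the .stl and baking a transform into its vertices. It assumes the mesh is loaded with MeshIO; the file path and offset values are placeholders you would replace with your own:

auto toolMesh = MeshIO::read<SurfaceMesh>("myProbeTool.stl"); // placeholder path

// Build a rigid transform that re-expresses the mesh in the frame you want
// the device to drive (here: shift the tool tip toward the local origin, example value)
Mat4d localTransform = Mat4d::Identity();
localTransform.block<3, 1>(0, 3) = Vec3d(0.0, 0.0, -0.05);

// ApplyToData bakes the transform into the vertex data instead of the
// geometry's post transform
toolMesh->transform(localTransform, Geometry::TransformType::ApplyToData);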

To control it with a haptic device you need to set one of those up. A Phantom Omni (Touch) through OpenHaptics by default gives you [-1, 1] bounds on all axes (which physically corresponds to the workspace in front of the device).

It is set up like so:

// Create the haptic manager module and a client for the attached device
imstkNew<HapticDeviceManager>       hapticManager;
std::shared_ptr<HapticDeviceClient> client = hapticManager->makeDeviceClient();
// Run the manager alongside the simulation through the driver
driver->addModule(hapticManager);

You can directly access the position and orientation data via:

client->getPosition();
client->getOrientation();
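
If you only need the tool to follow the device visually (no physics), a minimal sketch is to copy that pose onto the tool's geometry every frame. This assumes a SceneManager named sceneManager and a SceneObject named toolObj (both placeholders), and the scale/offset values are placeholders too:

connect<Event>(sceneManager, &SceneManager::postUpdate,
    [&](Event*)
    {
        auto toolGeom = toolObj->getVisualGeometry();
        // Map device space into world space with a placeholder scale and offset
        toolGeom->setTranslation(client->getPosition() * 0.1 + Vec3d(0.0, 1.0, 0.0));
        toolGeom->setRotation(client->getOrientation());
        toolGeom->postModified(); // tell the renderer the transform changed
    });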

However, in many of our examples we set the tool up as a rigid body with mass and inertia and then move it with forces, in particular a spring-damper force. This can be done like so:

auto controller = std::make_shared<RigidObjectController>(myRigidObject, myHapticDeviceClient);
controller->setLinearKs(100000.0); // Linear spring stiffness
controller->setAngularKs(300000000.0); // Angular spring stiffness
controller->setUseCritDamping(true); // Critical damping, automatically determines suitable damping
controller->setForceScaling(0.001); // Scale the force the user feels
controller->setTranslationScaling(0.001); // Scale the translation of the device
controller->setTranslationOffset(Vec3d(0.0, 1.0, 0.0)); // Apply a world offset that moves it up
controller->setSmoothingKernelSize(10); // Use a running average of the last 10 forces for the currently rendered force
scene->addController(controller);
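
The myRigidObject passed to the controller above is the rigid tool itself. As a rough sketch of what that setup looks like with the RigidObject2/RigidBodyModel2 API that RigidObjectController drives (member names and values here are from memory and are placeholders, so check them against your iMSTK version):

auto rbdModel = std::make_shared<RigidBodyModel2>();
rbdModel->getConfig()->m_dt = 0.001; // physics timestep

auto myRigidObject = std::make_shared<RigidObject2>("ProbeTool");
myRigidObject->setDynamicalModel(rbdModel);
myRigidObject->setPhysicsGeometry(toolMesh);   // the mesh loaded earlier
myRigidObject->setCollidingGeometry(toolMesh);
myRigidObject->setVisualGeometry(toolMesh);
myRigidObject->getRigidBody()->m_mass = 0.1;                                   // mass the spring pulls against
myRigidObject->getRigidBody()->m_intertiaTensor = Mat3d::Identity() * 10000.0; // inertia tensor (member name as in iMSTK)
scene->addSceneObject(myRigidObject);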

There are still two questions:

  1. Can a deviceClient be connected to a separate scene object instead of the scene, so we can control a single object?
  2. How do we pick an object in the scene? Is the VTK picker supported directly?

To control a specific object:

// Setup the tool
auto myLapTool = std::make_shared<PbdObject>();
// ... Setup my object ...

// Setup the controller component that takes the device and the tool to control
auto controller = myLapTool->addComponent<PbdObjectController>();
controller->setControlledObject(myLapTool);
controller->setDevice(myDevice);
// ... Setup the controller ...

// Add the object to the scene
scene->addSceneObject(myLapTool);
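
The myDevice above is just a DeviceClient; in recent iMSTK versions it can be created through the DeviceManagerFactory, which picks whichever device backend was enabled at build time. A sketch, assuming that factory is available in your version:

std::shared_ptr<DeviceManager> hapticManager = DeviceManagerFactory::makeDeviceManager();
std::shared_ptr<DeviceClient>  myDevice      = hapticManager->makeDeviceClient();
driver->addModule(hapticManager); // run the device manager with the simulation driver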

I would suggest checking out the examples, especially some of the haptics examples that demonstrate tool control (these are only built when OpenHaptics, Haply, or VRPN is enabled), as well as the VRLapToolSuturing example.

https://gitlab.kitware.com/iMSTK/iMSTK/-/blob/master/Examples/VirtualCoupling/VirtualCouplingExample.cpp

When you say pick, do you mean:

  • Pick a point on a geometry via a ray or other selection criteria?
  • Pick up an object with physics, such that there is two-way physics interaction between the thing picked up and the thing picking it up?

You can technically get the vtkRenderer and do all manner of VTK stuff from the VTKViewer if you like.
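
For example, a rough sketch of an under-the-cursor query done purely on the VTK side (the iMSTK accessor names here are assumptions from memory, and mouseX/mouseY are placeholder display coordinates):

auto vtkRen = std::dynamic_pointer_cast<VTKRenderer>(viewer->getActiveRenderer())->getVtkRenderer();

vtkNew<vtkPropPicker> picker;
if (picker->Pick(mouseX, mouseY, 0.0, vtkRen) != 0)
{
    vtkActor* pickedActor = picker->GetActor(); // map this actor back to your scene object's VisualModel
}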

We do have our own “picking” classes, but we also have something we call “grasping”, which is the physics-enabled equivalent (grasp a deformable tissue, grasp a rigid needle, etc.).
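
As a minimal sketch of the grasping side, assuming a PbdObject named tissueObj and a small analytic geometry graspArea (e.g. a Sphere at the tool tip); both names are placeholders:

auto grasping = std::make_shared<PbdObjectGrasping>(tissueObj);
scene->addInteraction(grasping);

// On button press, grasp whatever vertices fall inside the grasp area ...
grasping->beginVertexGrasp(graspArea);
// ... and release them when the button is released
grasping->endGrasp();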

I mean there are multiple objects in the scene, but only the object under (or nearest to) my cursor should move synchronously with my mouse. My render window is a vtkGenericOpenGLRenderWindow based on the “QtWindow” example. Is there an easy way to determine which object is under the cursor?

This example allows you to pick up an object by clicking.

https://gitlab.kitware.com/iMSTK/iMSTK/-/blob/master/Examples/PBD/PBDClothGrab/pbdClothGrabExample.cpp