> … real-time surgical guidance, like example <Example-Femur>

We don’t provide any such integrations, but I did toy with it today for a bit and got a working example on this branch. It just enables Qt in the build and provides an example; there’s no API code. It might be merged into iMSTK master soon. Untested on Linux, and the branch is likely to be rebased.

One of the issues with such an integration is that Qt and VTK use event-based rendering: they only render when an event requesting a render is posted. This is great for UIs and other things that don’t render often, but it’s undesirable for highly interactive applications like iMSTK and games, where we’d prefer more deterministic execution.
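For context, here’s roughly what the conventional event-driven setup looks like with VTK 9’s Qt widget (a generic sketch, not code from the branch; the scene contents are omitted):

```cpp
#include <QApplication>
#include <QSurfaceFormat>
#include <QVTKOpenGLNativeWidget.h>
#include <vtkGenericOpenGLRenderWindow.h>
#include <vtkNew.h>
#include <vtkRenderer.h>

int main(int argc, char* argv[])
{
    // Set a compatible OpenGL format before the QApplication is created
    QSurfaceFormat::setDefaultFormat(QVTKOpenGLNativeWidget::defaultFormat());
    QApplication app(argc, argv);

    // VTK render window embedded in a Qt widget
    QVTKOpenGLNativeWidget widget;
    vtkNew<vtkGenericOpenGLRenderWindow> renderWindow;
    vtkNew<vtkRenderer> renderer;
    renderWindow->AddRenderer(renderer);
    widget.setRenderWindow(renderWindow);
    widget.resize(800, 600);
    widget.show();

    // Qt owns the loop: the scene is re-rendered only when an event
    // (expose, resize, an explicit update(), ...) asks for a repaint.
    return app.exec();
}
```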

Luckily, VTK lets you make your own render calls, so we can provide a more deterministic execution pipeline where we’re always pushing frames rather than waiting on events. What I’ve done in my branch is call QApplication::processEvents before every render, which processes all pending Qt events each frame (a sketch of such a loop is below). The downside is that moving a slider, or doing anything in the UI that generates many events, would choke the simulation. As long as you aren’t fiddling with the Qt UI while the simulation is running, this is useful.

More useful would be to put the UI on a separate thread. The issue with that is that any communication between the UI and the scene may cause race conditions; one could start implementing synchronization mechanisms if they really wanted to.
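Roughly what that loop looks like (a sketch only, not the branch’s actual code; the simulation step is a placeholder callback):

```cpp
#include <functional>

#include <QApplication>
#include <QVTKOpenGLNativeWidget.h>
#include <vtkGenericOpenGLRenderWindow.h>

// Drive rendering ourselves instead of letting Qt's event loop decide when
// to repaint. stepSimulation stands in for the iMSTK scene/physics update.
void runFrameLoop(QVTKOpenGLNativeWidget& widget,
                  vtkGenericOpenGLRenderWindow* renderWindow,
                  const std::function<void()>& stepSimulation)
{
    while (widget.isVisible())
    {
        // Flush everything Qt has queued (mouse, keyboard, widget updates)
        // before the frame. A burst of UI events here (e.g. dragging a
        // slider) delays the simulation step below.
        QApplication::processEvents();

        stepSimulation();        // advance the simulation

        renderWindow->Render();  // explicit VTK render call, every frame
    }
}
```

Instead of handing control to `app.exec()` as in the sketch above, you’d call something like `runFrameLoop(widget, renderWindow, [&] { /* advance the iMSTK scene here */ });` after showing the widget.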