Hi,
As part of our project funded by the Scientific and Technological Research Council (TUBITAK), we have developed a haptic device based on the Arduino Leonardo. We plan to use this hardware within iMSTK Unity. How can I introduce it as a device?
Right now I can see two main ways to do this:
DeviceManager
Implement a DeviceManager and optionally a DeviceClient. The DeviceManager serves as the interface between your hardware and the iMSTK data in an instance of DeviceClient, or if necessary a subclass of it. The device manager usually runs in its own thread to handle the communication responsibilities. The new classes have to be added to the SWIG infrastructure so they become accessible in C#, and then you will need to write the appropriate code in Unity to access them. Both OpenHaptics and VRPN are exposed this way. You may or may not need to implement a separate DeviceClient class; that depends on timing behavior and some of your other requirements.
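For a rough idea of the shape this takes, here is a minimal sketch of such a pair. The class names, the serial handling, and the base-class hooks (constructor arguments, initModule/updateModule/uninitModule, the protected position/orientation members) are assumptions for illustration, not the verified iMSTK signatures; use the OpenHaptics and VRPN managers in Source/Devices as the authoritative reference.

```cpp
// Hypothetical sketch only: base-class hooks and members are assumed to
// mirror the pattern of the OpenHaptics manager in Source/Devices.
#include "imstkDeviceClient.h"
#include "imstkDeviceManager.h"
#include "imstkMath.h"

#include <memory>
#include <string>

class ArduinoDeviceClient : public imstk::DeviceClient
{
public:
    ArduinoDeviceClient(const std::string& name) : DeviceClient(name, "localhost") { }

    // Called from the manager's thread with fresh samples from the serial link
    void updateFromSerial(const imstk::Vec3d& pos, const imstk::Quatd& rot)
    {
        m_position    = pos;  // members assumed to be inherited from DeviceClient
        m_orientation = rot;
    }
};

class ArduinoDeviceManager : public imstk::DeviceManager
{
public:
    std::shared_ptr<imstk::DeviceClient> makeDeviceClient(const std::string& port)
    {
        m_client = std::make_shared<ArduinoDeviceClient>(port);
        return m_client;
    }

protected:
    bool initModule() override   { /* open the serial port */ return true; }
    void updateModule() override { /* parse a packet, then m_client->updateFromSerial(...) */ }
    void uninitModule() override { /* close the serial port */ }

private:
    std::shared_ptr<ArduinoDeviceClient> m_client;
};
```

Both classes would then need SWIG wrappers (and a small C# counterpart on the Unity side) before they show up in iMSTK Unity.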
VRPN Client
iMSTK utilizes VRPN to access other devices, so you could implement your device interface inside of VRPN. VRPN provides device-independent channels like Position, Orientation, Force, etc., which completely isolates the hardware specifics from the processing (iMSTK) side.
The current VRPN device handler doesn't deal with forces (read/write), so you'd also have to extend that. On the other hand, I don't think you would have to extend anything past the VRPN device (i.e. no SWIG extensions, no Unity wrapping).
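As a rough illustration of the write side, force commands can be pushed through VRPN's analog-output channels from the client. The device name "ArduinoHaptic@localhost" and the three-channel force layout below are made up for this example; in practice this would live inside the extended VRPN device handler's update loop.

```cpp
// Sketch: writing a force vector to a VRPN server via analog output channels.
// Device name and channel layout are assumptions for illustration.
#include <vrpn_Analog_Output.h>

int main()
{
    vrpn_Analog_Output_Remote forceOut("ArduinoHaptic@localhost");

    const double force[3] = { 0.0, -1.5, 0.2 };  // example force in N
    for (unsigned int i = 0; i < 3; ++i)
    {
        forceOut.request_change_channel_value(i, force[i]);
    }

    // Pump the connection so the request is actually sent
    forceOut.mainloop();
    return 0;
}
```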
There is a streaming_arduino device in the VRPN distribution, but it only pushes data out from the Arduino over serial communication. I don't know the details of your device, but you might be able to extend this to bidirectional communication. You'd still have to extend the VRPN device to write "analog" channels.
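If you do go down that road, the Arduino-side half of a bidirectional link is mostly a protocol question. The packet format below ("P x y z" out, "F x y z" in) and the pin mapping are invented for illustration.

```cpp
// Arduino sketch (illustrative protocol): stream position samples out and
// accept force commands back over the same serial port.
void setup()
{
    Serial.begin(115200);
}

void loop()
{
    // Stream the current sensor reading out (placeholder analog inputs)
    float px = analogRead(A0) / 1023.0f;
    float py = analogRead(A1) / 1023.0f;
    float pz = analogRead(A2) / 1023.0f;
    Serial.print("P ");
    Serial.print(px); Serial.print(' ');
    Serial.print(py); Serial.print(' ');
    Serial.println(pz);

    // Consume any force command that arrived from the host
    if (Serial.available() && Serial.read() == 'F')
    {
        float fx = Serial.parseFloat();
        float fy = Serial.parseFloat();
        float fz = Serial.parseFloat();
        // driveActuators(fx, fy, fz);  // placeholder for the actuator code
        (void)fx; (void)fy; (void)fz;
    }

    delay(1);  // roughly 1 kHz loop; tune to the device
}
```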
The Devices directory has the OpenHaptics and VRPN device drivers that you can use as a reference:
https://gitlab.kitware.com/iMSTK/iMSTK/-/tree/master/Source/Devices?ref_type=heads
I hope this helps