Integration of Haptic Device in 3D Slicer for Lumbar Puncture
Key Investigators
- Pablo Sergio Castellano Rodríguez (Universidad de Las Palmas de Gran Canaria, Spain)
- Jose Carlos Mateo Pérez (Universidad de Las Palmas de Gran Canaria, Spain)
- Juan Bautista Ruiz Alzola (Universidad de Las Palmas de Gran Canaria, Spain)
Presenter location: In-person
Project Description
The main objective of this project is to integrate the 3D Systems Touch haptic device into 3D Slicer through an OpenIGTLink connection with the Unity platform. SlicerToTouch is the 3D Slicer module that contains the scene with the 3D models of the spine and the needle. Its interface lets the user configure the number, positions, and force values of the resistances to be exerted by the haptic device. These values are written to a .json file that is then transferred to Unity, which processes the data and configures the forces of the haptic device within the Unity environment. Finally, an OpenIGTLink bridge establishes a real-time connection through which the transforms and the resistances of the haptic device are shared with the 3D Slicer scene.
The idea originates from a lumbar puncture training system that uses this device but whose body tissues, locations, and thicknesses are generic. With our approach, a segmentation of a real patient's back, with its own characteristics, can be used to practice the lumbar puncture before performing it on the patient. Because of the way it works, it could also be applied to other procedures.
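For illustration, the exported configuration might look like the following sketch; the field names and values here are placeholders of our own, not the module's actual schema:

```python
import json

# Hypothetical layout of the resistance configuration file.
# Field names and values are illustrative placeholders only.
resistances = {
    "resistances": [
        # One entry per tissue layer: a position in the scene (mm)
        # and the force value the haptic device should exert there.
        {"name": "skin", "position": [0.0, -80.0, 10.0], "force": 0.5},
        {"name": "ligamentum_flavum", "position": [0.0, -65.0, 10.0], "force": 2.0},
    ]
}

with open("resistances.json", "w") as f:
    json.dump(resistances, f, indent=2)
```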
Objective
- Create a module that, with the help of Unity and OpenIGTLink, lets us interact with a back model of a real patient obtained by segmenting medical images. In this way, lumbar puncture can be practiced on the model of a real patient while feeling the resistance of the body tissues.
- Automate the creation of resistances on segmentation-generated models so that clinicians can easily practice lumbar puncture and other procedures with a sense of realism.
Approach and Plan
- Create the 3D Slicer module with fields to enter the number of resistances, their positions, and their values.
- Generate a .json file with all the information entered in the module.
- Create a Unity project with a script that reads the generated .json file and builds a scene with the resistances at those positions and with those values.
- Connect Unity to 3D Slicer through OpenIGTLink to send the transforms and see the needle movements in 3D Slicer (see the connector sketch after this list).
- Generate an executable application from the Unity project with a simple look and feel that performs the procedure automatically, so that clinical users find it easy to use and do not have to deal with the Unity interface.
- Search the literature on other procedures to check that the project works correctly for them. We are looking for other clinical procedures for which this project may be useful and for which there is information available.
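As a minimal sketch of the Slicer side of this bridge, assuming the SlicerOpenIGTLink extension is installed and Unity connects as a client on the default port, the connector can be created from the Slicer Python console:

```python
import slicer

# Create an OpenIGTLink server in 3D Slicer (requires the
# SlicerOpenIGTLink extension); Unity connects to it as a client.
connector = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLIGTLConnectorNode")
connector.SetTypeServer(18944)  # default OpenIGTLink port
connector.Start()

# Incoming TRANSFORM messages from Unity then appear in the MRML scene
# as linear transform nodes named after the streamed device.
```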
Progress and Next Steps
Steps already completed:
- Create a 3D model of a real patient's back from a segmentation of medical images.
- Indicate with the Markups module the tissues we want to feel.
- Enter in the SlicerToTouch module the number of resistances, positions, and force value for each tissue. The module generates a .json file with all this information.
- Using Unity in the background, read that file and automatically create a new scene with the haptic materials. At the same time, Unity sends the transform of the haptic device to 3D Slicer via OpenIGTLink, so the needle can be seen moving (a sketch of the receiving side follows this list).
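A minimal sketch of that receiving side, assuming placeholder node names ("NeedleToSlicer" for the streamed transform and "NeedleModel" for the needle mesh; the real names depend on how the Unity sender labels its messages):

```python
import slicer

# Placeholder node names; the actual names depend on the Unity sender.
needleTransform = slicer.util.getNode("NeedleToSlicer")
needleModel = slicer.util.getNode("NeedleModel")

# Attach the needle model to the streamed transform so the mesh
# follows the haptic stylus in real time in the 3D Slicer scene.
needleModel.SetAndObserveTransformNodeID(needleTransform.GetID())
```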
What we are currently working on:
- Integration of HoloLens 2 for the visualization of the scene.
Next steps:
- Include metrics in order to analyze the procedure.
- Restrict the movement to a single axis once the needle is inside the back model (see the sketch below).
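The one-axis restriction amounts to projecting the stylus position onto the insertion axis. A sketch of the intended geometry, with placeholder coordinates:

```python
import numpy as np

# Once the needle tip is inside the model, constrain its position to the
# insertion axis defined by the entry point p0 and unit direction d.
def constrain_to_axis(p, p0, d):
    d = d / np.linalg.norm(d)
    return p0 + np.dot(p - p0, d) * d

p0 = np.array([0.0, -80.0, 10.0])   # entry point (placeholder coordinates)
d = np.array([0.0, 1.0, 0.0])       # insertion direction
tip = np.array([1.5, -70.0, 9.0])   # current stylus position
print(constrain_to_axis(tip, p0, d))  # tip snapped onto the axis
```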
Illustrations
3D Slicer module in which the resistances are entered (left) and the .json file with the information about these resistances (right) (Picture1.png)
Unity interface after reading the information from the .json file, with the resistances created at their positions and the needle as a visual mesh of the haptic device (left), and the script that makes it work (right) (Picture2.png)