One challenge facing image-guided surgery systems is presenting 3D information on a 2D surface. As a potential solution, we are investigating the utility of sonification (transforming data into sound) for providing real-time, continuous distance information within the context of image-guided surgery. More specifically, we are interested in the ability of real-time sound synthesis to facilitate the perception of depth (i.e. “hearing depth”), the dimension that suffers most in 2D visualizations. The sounds that we use in this research are inspired by sonification techniques employed in other commercial domains, such as the automotive and aerospace industries.
Image-guided neurosurgery, or neuronavigation, has been used to visualize the location of a surgical probe by mapping the probe position onto pre-operative models of a patient’s anatomy. One common limitation of this approach is that it requires the surgeon to divert their attention away from the patient and towards the neuronavigation system. To address this limitation, we designed a system that sonifies (i.e. provides audible feedback of) the distance between a surgical probe and the anatomy of interest. Results from our user studies are consistent with the idea that combining auditory distance cues with the existing visual information of an image-guided surgery system yields greater accuracy when locating specified points on a pre-operative scan, and reduces the perceived difficulty of locating a target within a 3D volume.
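To make the idea concrete, the core of such a system is a mapping from probe-to-target distance to a synthesis parameter such as pitch. The following is a minimal Python sketch of that mapping; the distance range, frequency bounds, and exponential pitch interpolation (distance_to_frequency, d_max, f_min, f_max) are illustrative assumptions for this sketch, not the exact mapping evaluated in the study below.

    import math

    import numpy as np

    SAMPLE_RATE = 44100  # audio samples per second


    def distance_to_frequency(d_mm, d_max=50.0, f_min=220.0, f_max=880.0):
        """Map probe-target distance (mm) to a tone frequency (Hz).

        Closer probe -> higher pitch. Exponential interpolation makes equal
        distance steps sound like equal musical intervals. d_max, f_min, and
        f_max are illustrative values, not parameters from the cited study.
        """
        d = min(max(d_mm, 0.0), d_max)   # clamp to the sonified range
        t = 1.0 - d / d_max              # 0 at d_max, 1 at the target
        return f_min * (f_max / f_min) ** t


    def tone_block(freq_hz, duration_s=0.05):
        """Synthesize one short sine block for the current probe position."""
        t = np.arange(int(SAMPLE_RATE * duration_s)) / SAMPLE_RATE
        return 0.3 * np.sin(2.0 * np.pi * freq_hz * t)


    # Example: a probe approaching a target at the origin (coordinates in mm),
    # as positions might arrive from a tracking system each update cycle.
    target = (0.0, 0.0, 0.0)
    blocks = []
    for probe in [(40.0, 0.0, 0.0), (20.0, 0.0, 0.0), (5.0, 0.0, 0.0)]:
        d = math.dist(probe, target)
        f = distance_to_frequency(d)
        print(f"distance {d:5.1f} mm -> tone at {f:6.1f} Hz")
        blocks.append(tone_block(f))

    audio = np.concatenate(blocks)  # a live system streams blocks to the audio device

In a live neuronavigation pipeline, the probe position would be supplied by the tracking system at each update, and the synthesized blocks would be streamed continuously to the audio output rather than concatenated offline.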
References:
Plazak, J., Drouin, S., Collins, D. L., & Kersten-Oertel, M. (in press). Distance sonification in image-guided neurosurgery. Healthcare Technology Letters.