Virtual Reality Simulators for Endoscopic Sinus and Skull Base Surgery: The Present and Future

Article information

Clin Exp Otorhinolaryngol. 2019;12(1):12-17
Publication date (electronic): 2018 October 18
doi: https://doi.org/10.21053/ceo.2018.00906
1Department of Otolaryngology-Head and Neck Surgery, Seoul St. Mary’s Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea
2Department of Neurosurgery, Seoul St. Mary’s Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea
Corresponding author: Sung Won Kim Department of Otolaryngology-Head and Neck Surgery, Seoul St. Mary’s Hospital, College of Medicine, The Catholic University of Korea, 222 Banpo-daero, Seocho-gu, Seoul 06591, Korea Tel: +82-2-2258-6216, Fax: +82-2-535-1354 E-mail: kswent@catholic.ac.kr
Received 2018 June 11; Revised 2018 August 24; Accepted 2018 August 24.

Abstract

Endoscopic sinus and skull base surgeries are minimally invasive surgical techniques that reduce postoperative symptoms and complications and enhance patients’ quality of life. However, to ensure excellent surgical outcomes after such interventions, intimate familiarity with important landmarks and high-level endoscope manipulation skills are essential. Cadaver training is one possible option, but cadavers are expensive, scarce, and nonreusable, and cadaver work requires specialized equipment and staff. In addition, it is difficult to mimic specific diseases using cadavers. Virtual reality simulators can create a computerized environment in which the patient’s anatomy is reproduced and can be explored through endoscope handling with realistic haptic feedback. Moreover, they can be used to present scenarios that improve trainees’ skills and confidence. Therefore, virtual simulator training can be implemented at all levels of surgical education. This review introduces the current literature on virtual reality training for endoscopic sinus and skull base surgeons, and discusses the directions of future development.

INTRODUCTION

Since endoscopes were first used in paranasal sinus and skull base surgery, the indications for endoscopic surgery have been extended; minimally invasive surgical techniques reduce postoperative symptoms and complications and enhance the quality of life of patients [1,2]. Compared to the traditional microscope, the endoscope provides a larger field of view when treating pituitary adenomas and skull base conditions. However, to ensure excellent surgical outcomes after endoscopic endonasal intervention, intimate familiarity with important landmarks and high-level endoscope manipulation skills (including avoidance of instrument tangles in a narrow space) are essential. In particular, during skull base surgery, carotid artery or other vascular injuries can be debilitating or fatal, and cranial nerve injury can cause catastrophic disability. Although residents are trained to perform endoscopic surgery under close supervision, many critical procedures are still performed by the attending surgeon. Therefore, it is essential to shorten the learning curve for various surgical conditions. Cadaver training is one possible option, but cadavers are expensive, scarce, and nonreusable, and cadaver work requires specialized equipment and staff. In addition, it is difficult to mimic specific diseases using cadavers. Three-dimensional (3D) printed models and other alternatives have been developed, but are expensive and have limitations similar to those of cadaver training. Virtual reality can be used to present scenarios that improve trainee skills and confidence. Therefore, virtual simulator training can be implemented at all levels of surgical education. Here, we review the current literature on virtual reality training for endoscopic sinus and skull base surgeons, and discuss the directions of future developments.

VIRTUAL SIMULATORS OF ENDOSCOPIC SINUS AND SKULL BASE SURGERY

The available virtual simulators are listed in Table 1.

Endoscopic sinus surgery simulator

The endoscopic sinus surgery simulator (ES3) was developed in 1997 by Lockheed Martin (Bethesda, MD, USA) [3-5] to provide a virtual surgical environment in which a trainee manipulates an endoscope and other instruments inside the nasal cavity of a mannequin. All tools afford touch feedback. The simulator features three levels of difficulty, allowing effective step-by-step practice. The entire performance is recorded and scored. A penalty is applied when the trainee exceeds a time limit or damages important anatomical structures such as the optic nerve. Although there have been many validation studies [6-15], development costs have rendered the simulator very expensive, and it is no longer available.

Nasal endoscopy simulator

The nasal endoscopy simulator (NES), developed in Germany, was also introduced in 1997 [16,17] to simulate endoscopic sinus surgery. The user explores a model of the head with actual surgical instruments whose positions are tracked by an electromagnetic system. Unlike the ES3, the NES provides no haptic feedback. It tracks motion, detects collisions between instruments, and simulates tissue deformation. Errors (such as clashes between endoscopic tools or damage to crucial tissues) are recorded. This simulator also provides three levels of difficulty.

Dextroscope

The Dextroscope was developed in Singapore in 2003 [18,19]. The workstation features stereoscopic glasses, a stylus, and a control handle (a joystick). The 3D glasses and a mirrored display enable 3D interaction with patient-specific computed tomography (CT) and/or magnetic resonance imaging data. The Dextroscope does not use a model head. The joystick operates the endoscope, and the stylus operates the microdebrider or drill. The 3D graphics reflect the anatomical shape; however, force feedback is lacking, as are physiological responses such as blanching after injection or bleeding. The model is rather stiff, preventing gentle displacement of certain structures.

Simulation of transsphenoidal endoscopic pituitary surgery

The VRVis Research Center and the Department of Neurosurgery of the Medical University of Vienna collaborated to develop the simulation of transsphenoidal endoscopic pituitary surgery (STEPS) system between 2004 and 2009 [20-22]. This system trains surgeons in the preoperative planning and performance of transsphenoidal pituitary surgery. STEPS has been integrated into the Impax EE PACS system (Agfa HealthCare, Bonn, Germany). Preoperatively, the nasal cavity and skull base are reconstructed in a 3D virtual endoscopic environment. The crucial anatomical structures are displayed in the background behind semitransparent boundaries, and surface rigidity is color-coded. The user selects anatomical structures to be displayed, views them from various angles, modifies visual factors, defines landmarks, and removes tissues. Angled endoscopes and other simulated instruments such as rongeurs are used to replicate the transsphenoidal approach. To prevent instrument manipulations that are impossible in the real world, instrument collisions are detected and force feedback is delivered. In 2010, STEPS was upgraded using the Stealth Station image-guided navigation system (Medtronic, Minneapolis, MN, USA) [23]. Currently, the endoscope and additional instruments are optically tracked during interventions, and their positions are passed to STEPS, which generates the corresponding virtual images.
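The collision handling described here can be illustrated with a minimal sketch (not the STEPS implementation): an instrument-tip position is mapped into a segmented CT labelmap, and any non-air voxel counts as a collision whose label can scale the force feedback. The labelmap, voxel spacing, and stiffness values below are hypothetical.

```python
import numpy as np

# Hypothetical segmented CT volume: each voxel carries an anatomical label
# (0 = air, 1 = mucosa, 2 = bone, 3 = critical structure such as the carotid).
labelmap = np.zeros((256, 256, 200), dtype=np.uint8)

# Illustrative per-label stiffness used to scale the reaction force.
STIFFNESS = {0: 0.0, 1: 0.2, 2: 1.0, 3: 1.0}

def world_to_voxel(p_mm, spacing_mm, origin_mm):
    """Convert a world-space instrument-tip position (mm) to voxel indices."""
    return np.round((np.asarray(p_mm, float) - origin_mm) / spacing_mm).astype(int)

def check_collision(tip_mm, spacing_mm=(0.5, 0.5, 0.8), origin_mm=(0.0, 0.0, 0.0)):
    """Return (colliding, label, stiffness) for the voxel under the tip."""
    idx = world_to_voxel(tip_mm, np.asarray(spacing_mm), np.asarray(origin_mm))
    if np.any(idx < 0) or np.any(idx >= labelmap.shape):
        return False, 0, 0.0          # tip lies outside the reconstructed volume
    label = int(labelmap[tuple(idx)])
    return label != 0, label, STIFFNESS.get(label, 1.0)

print(check_collision((10.0, 12.5, 40.0)))   # -> (False, 0, 0.0) on the empty volume
```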

CardinalSim

The CardinalSim, which simulates endoscopic sinus surgery, was developed in 2009 [24]. Patient-specific endonasal anatomy is reconstructed from CT data, allowing preoperative virtual fly-through. A model of an actual endoscope attached to a haptic device is available. The user can explore the virtual endonasal anatomy and, using a virtual microdebrider, remove tissue during simulated surgery. Force feedback creates resistance as the tip of the instrument contacts the reconstructed anatomy. A recent report claimed that constructing a usable model from a CT scan required 1–2 hours, depending on the number and complexity of the segmented structures [25]. Preoperative surgical rehearsal of sinus and skull base surgery can be conducted using standard PC hardware and a commercial haptic device.
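The resistance described here is commonly rendered with a penalty- or proxy-style spring model: the reaction force grows with how far the tip has penetrated the reconstructed surface and is clamped at the device's limit. The following sketch illustrates the idea with assumed stiffness and force values; it is not the CardinalSim code.

```python
import numpy as np

def contact_force(tip, surface_point, surface_normal, k=0.15, f_max=3.0):
    """Spring-like reaction force for a haptic device (illustrative sketch).

    tip            -- instrument-tip position (mm)
    surface_point  -- closest point on the reconstructed anatomy (mm)
    surface_normal -- outward unit normal at that point
    k              -- stiffness (N/mm), tuned per tissue type (assumed value)
    f_max          -- clamp so the device is never driven past its force limit
    """
    n = np.asarray(surface_normal, float)
    penetration = float(np.dot(np.asarray(surface_point, float) - np.asarray(tip, float), n))
    if penetration <= 0.0:            # tip is outside the tissue: no resistance
        return np.zeros(3)
    return min(k * penetration, f_max) * n

# Tip 2 mm past a surface whose normal points along +z: the force pushes it back out.
print(contact_force(tip=(0, 0, -2.0), surface_point=(0, 0, 0.0), surface_normal=(0, 0, 1.0)))
```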

VOXEL-MAN SinuSurg

The VOXEL-MAN SinuSurg was developed by a German research group in 2010 [26]. The basic layout is that of the VOXEL-MAN TempoSurg simulator of temporal bone surgery [27]. The simulator runs on standard PC hardware and features the “Phantom Omni” haptic device (SensAble Technologies, Woburn, MA, USA). Customized algorithms allow subvoxel visualization, volume cutting, and haptic rendering.

Flinders sinus surgery simulator

The Flinders sinus surgery simulator was developed in 2013 and features two haptic manipulators [28-30]. The “Novint Falcon” (Novint Technologies, Delaware, USA) senses 3D position and provides linear haptic feedback; a controllable 0° endoscope is attached to it. The Phantom Omni haptic device (SensAble Technologies) senses both 3D position and orientation, also provides linear haptic feedback, and controls the surgical instruments. Collision detection and force feedback are available; the textures of mucosa and deformable tissues are rendered realistically using voxel- and triangle-based surface mesh models. Blinn-Phong lighting [31] is used to shade flat, featureless areas. The action of vasoconstrictive drugs is simulated to increase the space available for endoscopic instrumentation.
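Blinn-Phong shading combines ambient, diffuse, and specular terms computed from the surface normal, the light direction, and the half-vector between light and viewer; the specular highlight is what keeps flat mucosal areas from looking featureless. A minimal per-point sketch with assumed material parameters (not the Flinders shader) follows.

```python
import numpy as np

def _normalize(v):
    v = np.asarray(v, float)
    return v / np.linalg.norm(v)

def blinn_phong(normal, light_dir, view_dir,
                base_color=(0.85, 0.45, 0.45),   # assumed mucosa-like tint
                ambient=0.15, k_diff=0.7, k_spec=0.4, shininess=32.0):
    """Blinn-Phong shading for one surface point (illustrative parameters)."""
    n = _normalize(normal)
    l = _normalize(light_dir)   # toward the light; in endoscopy this is the scope tip
    v = _normalize(view_dir)    # toward the camera (nearly identical to l here)
    h = _normalize(l + v)       # half-vector: Blinn's replacement for the reflection vector

    diffuse = k_diff * max(float(np.dot(n, l)), 0.0)
    specular = k_spec * max(float(np.dot(n, h)), 0.0) ** shininess
    rgb = (ambient + diffuse) * np.asarray(base_color) + specular
    return np.clip(rgb, 0.0, 1.0)

# A wet mucosal wall facing the endoscope produces a bright specular highlight.
print(blinn_phong(normal=(0, 0, 1), light_dir=(0, 0.2, 1), view_dir=(0, 0, 1)))
```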

NeuroTouch Endo

Engineers and neurosurgeons of the National Research Council of Canada collaborated to develop the NeuroTouch neurosurgical simulator (Fig. 1) [32,33]. NeuroTouch offers various training scenarios based on real cases, including transsphenoidal endonasal pituitary surgery. The first version allows the user to practice endoscopic manipulation and detection of the sphenoid ostium. The user inserts a virtual endoscope affording haptic feedback into the nostril and explores the simulated endonasal anatomy. When an instrument contacts tissue, deformation is computed in real time using physics-based models. When the tip of the endoscope touches the mucosa, the lens can become stained, blurring the view; forceful contact can trigger bleeding, obstructing the view. Stains or blood can be rinsed away by pressing a pedal. Important anatomical landmarks, such as the septum and ostium, can be labeled, and the labels become visible as the endoscope approaches the structures. The user can toggle between an interior and an exterior view of the nasal cavity, which is particularly useful when initially inserting the endoscope into the nostril. During simulation, the trainee advances the endoscope to the sphenoid ostium, opens the ostium, and then enters the other nostril to find the opposite ostium. When the trainee reaches the first ostium, a “mission completed” message appears. The simulation ends when both sphenoid ostia have been found, and performance is scored by reference to the number of targets attained, the force used, and the time taken.
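Scoring by targets attained, force used, and time taken can be expressed as a weighted sum of normalized terms. The sketch below uses hypothetical weights and limits purely to make the idea concrete; it is not the actual NeuroTouch metric.

```python
def performance_score(targets_found, total_targets,
                      peak_force_n, force_limit_n,
                      time_s, time_limit_s,
                      weights=(0.5, 0.25, 0.25)):
    """Combine targets attained, force used, and time taken into a 0-100 score.

    The weighting and normalization here are assumptions for illustration,
    not the NeuroTouch scoring formula.
    """
    w_target, w_force, w_time = weights
    target_score = targets_found / total_targets
    force_score = max(0.0, 1.0 - peak_force_n / force_limit_n)  # gentler is better
    time_score = max(0.0, 1.0 - time_s / time_limit_s)          # faster is better
    return 100.0 * (w_target * target_score + w_force * force_score + w_time * time_score)

# Both sphenoid ostia found, moderate peak force, finished well within the time limit.
print(round(performance_score(2, 2, 1.2, 3.0, 240, 600), 1))   # -> 80.0
```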

Fig. 1.

NeuroTouch (National Research Council of Canada, Ottawa, Canada). The NeuroTouch simulator features bimanual haptic manipulators. Real-time physics-based computations of tissue deformation are available. The nostril view is shown on the bottom right of the monitor.

McGill simulator for endoscopic sinus surgery

The McGill simulator was developed by McGill University and the National Research Council of Canada [34]. The simulator features the NeuroTouch platform upgraded to a bimanual system; the trainee uses both an endoscope and other instruments as in real life. The hardware was improved, a mannequin head was added, the distance between the handle and the action axis of the haptic device was shortened, and additional haptic devices were inserted in the head using shafts [35]. However, insertion of two haptic devices (the endoscope and an instrument) in the ipsilateral nostril is impossible, because the action axis of the Phantom Omni haptic device is too thick.

DISCUSSION

Surgical simulators play important roles in the preoperative assessment of patient anatomy, in reducing the risk of injury to anatomical structures near the surgical target, and in helping novice surgeons overcome the steep learning curve associated with endoscopic procedures. Several virtual simulators of endoscopic sinus and skull base surgery are available. Future improvements will render cadaver training irrelevant.

Patient-specific algorithms for preoperative planning are needed. Usually, an anatomical model is reconstructed from radiological images. Such images are often inaccurate, even when the slices used for reconstruction are very thin. To correctly depict structures of interest, it is often necessary to resort to manual segmentation, which is time-consuming and tedious. Furthermore, the final visual result depends on the so-called transfer function, which maps the intensities of the radiological source materials to colors and levels of visibility. Selection of an inappropriate transfer function causes structures to be missed or displayed incorrectly. Therefore, software resolving these issues is essential and efforts are underway.
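A transfer function of the kind described above is, in its simplest form, a piecewise-linear lookup that maps each CT intensity to a color and an opacity; choosing its control points poorly is precisely how structures vanish or appear falsely solid. The sketch below uses assumed Hounsfield-unit control points for illustration only.

```python
import numpy as np

# Hypothetical control points: (intensity in Hounsfield units, R, G, B, opacity).
# Air is kept invisible, soft tissue semi-transparent, bone opaque.
CONTROL_POINTS = np.array([
    [-1000, 0.00, 0.00, 0.00, 0.00],   # air
    [ -100, 0.80, 0.50, 0.40, 0.05],   # fat / mucosa
    [   40, 0.90, 0.60, 0.50, 0.30],   # soft tissue
    [  300, 1.00, 0.90, 0.80, 0.80],   # cancellous bone
    [ 1000, 1.00, 1.00, 1.00, 1.00],   # cortical bone
])

def transfer_function(hu):
    """Map CT intensities (HU) to RGBA by piecewise-linear interpolation."""
    hu = np.atleast_1d(np.asarray(hu, float))
    channels = [np.interp(hu, CONTROL_POINTS[:, 0], CONTROL_POINTS[:, c])
                for c in range(1, 5)]
    return np.stack(channels, axis=-1)

# An opacity ramp that rises too late is exactly what makes thin bony lamellae
# (e.g., the lamina papyracea) disappear from the rendering.
print(transfer_function([-1000, 40, 500]))
```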

Haptic devices that improve real-life surgical skills are required. Many simulators feature commercial haptic devices. However, freely moving devices do not adequately improve skills and confidence. Haptic devices for laparoscopy are being vigorously developed. A laparoscope enters via a “port,” and several mutually distant ports may be used; when each instrument has its own port, haptic delivery and tracking are simple. However, the nostril is a hole, not a port. In addition, the endoscope and instruments are inserted into the ipsilateral nostril during sinus and skull base surgery; three instruments must therefore operate very closely together during endoscopic skull base surgery using the two-nostrils/four-hands technique (Fig. 2) [36]. Therefore, the development of haptic devices is very challenging. However, training on how an endoscope and instruments interact in the nostril is very important, particularly when the endoscope is angled and the instruments are curved. The haptic devices of the McGill simulator for endoscopic sinus surgery are continually being improved. However, devices that allow two thin, light instruments to work in one nostril while affording realistic microsurgical feedback are essential.

Fig. 2.

The two-nostrils/four-hands technique for endoscopic skull base surgery. Three instruments operate in very close proximity.

Finally, an assessment system that allows novice surgeons to climb the learning curve is required. Validation testing of, and improvements in, software and hardware are ongoing, but this work is time-consuming and expensive. The ES3 has been the most extensively validated and assessed. To balance cost-effectiveness and quality control, software and hardware that can be readily upgraded, together with an interactive interface such as a Content Service Platform, are needed. A Content Service Platform is an online-networked service platform for the simulation system; it manages content, assessment data, and user account information, and supports a coaching community.

The points noted above constitute the general goals of virtual reality simulators. However, the emphasis differs according to each simulator’s objective: surgical rehearsal programs should focus on fast, realistic conversion of patient images, whereas surgical training simulators should focus on scenario and assessment systems.

Several virtual reality simulators mimicking real surgical settings are available and have improved over time. However, optimized rendering software, custom haptic devices for endonasal surgery, and appropriate trainee assessment are still needed. Currently, cadaver training is optimal, but it is costly and difficult to access. In the future, virtual reality simulators will render cadaver training obsolete.

HIGHLIGHTS

▪ There is a learning curve associated with endoscopic sinus and skull base surgery.

▪ Virtual reality simulators can be effective training tools that help novice surgeons shorten the learning curve.

▪ Optimized software, custom-made devices, and proper assessment are required in this field.

Notes

No potential conflict of interest relevant to this article was reported.

Acknowledgements

This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (2017R1D1A1B03027903), by the Bio & Medical Technology Development Program of the NRF funded by the Ministry of Science and ICT (2018M3A9E8020856), and by the Korea Health Industry Development Institute funded by the Ministry of Health and Welfare (HI14C3228, HI15C0133), Republic of Korea.

References

1. Rotenberg B, Tam S, Ryu WH, Duggal N. Microscopic versus endoscopic pituitary surgery: a systematic review. Laryngoscope 2010;Jul. 120(7):1292–7.
2. Strychowsky J, Nayan S, Reddy K, Farrokhyar F, Sommer D. Purely endoscopic transsphenoidal surgery versus traditional microsurgery for resection of pituitary adenomas: systematic review. J Otolaryngol Head Neck Surg 2011;Apr. 40(2):175–85.
3. De Nicola M, Salvolini L, Salvolini U. Virtual endoscopy of nasal cavity and paranasal sinuses. Eur J Radiol 1997;May. 24(3):175–80.
4. Edmond CV Jr, Heskamp D, Sluis D, Stredney D, Sessanna D, Wiet G, et al. ENT endoscopic surgical training simulator. Stud Health Technol Inform 1997;39:518–28.
5. Rudman DT, Stredney D, Sessanna D, Yagel R, Crawfis R, Heskamp D, et al. Functional endoscopic sinus surgery training simulator. Laryngoscope 1998;Nov. 108(11 Pt 1):1643–7.
6. Arora H, Uribe J, Ralph W, Zeltsan M, Cuellar H, Gallagher A, et al. Assessment of construct validity of the endoscopic sinus surgery simulator. Arch Otolaryngol Head Neck Surg 2005;Mar. 131(3):217–21.
7. Edmond CV Jr. Impact of the endoscopic sinus surgical simulator on operating room performance. Laryngoscope 2002;Jul. 112(7 Pt 1):1148–58.
8. Edmond CV Jr, Wiet GJ, Bolger B. Virtual environments: surgical simulation in otolaryngology. Otolaryngol Clin North Am 1998;Apr. 31(2):369–81.
9. Fried MP, Sadoughi B, Gibber MJ, Jacobs JB, Lebowitz RA, Ross DA, et al. From virtual reality to the operating room: the endoscopic sinus surgery simulator experiment. Otolaryngol Head Neck Surg 2010;Feb. 142(2):202–7.
10. Fried MP, Sadoughi B, Weghorst SJ, Zeltsan M, Cuellar H, Uribe JI, et al. Construct validity of the endoscopic sinus surgery simulator. II. Assessment of discriminant validity and expert benchmarking. Arch Otolaryngol Head Neck Surg 2007;Apr. 133(4):350–7.
11. Fried MP, Satava R, Weghorst S, Gallagher AG, Sasaki C, Ross D, et al. Identifying and reducing errors with surgical simulation. Qual Saf Health Care 2004;Oct. 13 Suppl 1:i19–26.
12. Kassam AB, Prevedello DM, Carrau RL, Snyderman CH, Thomas A, Gardner P, et al. Endoscopic endonasal skull base surgery: analysis of complications in the authors’ initial 800 patients. J Neurosurg 2011;Jun. 114(6):1544–68.
13. Solyar A, Cuellar H, Sadoughi B, Olson TR, Fried MP. Endoscopic Sinus Surgery Simulator as a teaching tool for anatomy education. Am J Surg 2008;Jul. 196(1):120–4.
14. Uribe JI, Ralph WM Jr, Glaser AY, Fried MP. Learning curves, acquisition, and retention of skills trained with the endoscopic sinus surgery simulator. Am J Rhinol 2004;Mar-Apr. 18(2):87–92.
15. Weghorst S, Airola C, Oppenheimer P, Edmond CV, Patience T, Heskamp D, et al. Validation of the Madigan ESS simulator. Stud Health Technol Inform 1998;50:399–405.
16. Bockholt U, Muller W, Voss G, Ecke U, Klimek L. Real-time simulation of tissue deformation for the nasal endoscopy simulator (NES). Comput Aided Surg 1999;4(5):281–5.
17. Ecke U, Klimek L, Muller W, Ziegler R, Mann W. Virtual reality: preparation and execution of sinus surgery. Comput Aided Surg 1998;3(1):45–50.
18. Caversaccio M, Eichenberger A, Hausler R. Virtual simulator as a training tool for endonasal surgery. Am J Rhinol 2003;Sep-Oct. 17(5):283–90.
19. Kockro RA, Stadie A, Schwandt E, Reisch R, Charalampaki C, Ng I, et al. A collaborative virtual reality environment for neurosurgical planning and training. Neurosurgery 2007;Nov. 61(5 Suppl 2):379–91.
20. Neubauer A, Wolfsberger S, Forster MT, Mroz L, Wegenkittl R, Buhler K. Advanced virtual endoscopic pituitary surgery. IEEE Trans Vis Comput Graph 2005;Sep-Oct. 11(5):497–507.
21. Wolfsberger S, Forster MT, Donat M, Neubauer A, Buhler K, Wegenkittl R, et al. Virtual endoscopy is a useful device for training and preoperative planning of transsphenoidal endoscopic pituitary surgery. Minim Invasive Neurosurg 2004;Aug. 47(4):214–20.
22. Wolfsberger S, Neubauer A, Buhler K, Wegenkittl R, Czech T, Gentzsch S, et al. Advanced virtual endoscopy for endoscopic transsphenoidal pituitary surgery. Neurosurgery 2006;Nov. 59(5):1001–9.
23. Schulze F, Buhler K, Neubauer A, Kanitsar A, Holton L, Wolfsberger S. Intra-operative virtual endoscopy for image guided endonasal transsphenoidal pituitary surgery. Int J Comput Assist Radiol Surg 2010;Mar. 5(2):143–54.
24. Parikh SS, Chan S, Agrawal SK, Hwang PH, Salisbury CM, Rafii BY, et al. Integration of patient-specific paranasal sinus computed tomographic data into a virtual surgical environment. Am J Rhinol Allergy 2009;Jul-Aug. 23(4):442–7.
25. Won TB, Hwang P, Lim JH, Cho SW, Paek SH, Losorelli S, et al. Early experience with a patient-specific virtual surgical simulation for rehearsal of endoscopic skull-base surgery. Int Forum Allergy Rhinol 2018;Jan. 8(1):54–63.
26. Tolsdorff B, Pommert A, Hohne KH, Petersik A, Pflesser B, Tiede U, et al. Virtual reality: a new paranasal sinus surgery simulator. Laryngoscope 2010;Feb. 120(2):420–6.
27. Zirkle M, Roberson DW, Leuwer R, Dubrowski A. Using a virtual reality temporal bone simulator to assess otolaryngology trainees. Laryngoscope 2007;Feb. 117(2):258–63.
28. Dharmawardana N, Ruthenbeck G, Woods C, Elmiyeh B, Diment L, Ooi EH, et al. Validation of virtual-reality-based simulations for endoscopic sinus surgery. Clin Otolaryngol 2015;Dec. 40(6):569–79.
29. Diment LE, Ruthenbeck GS, Dharmawardana N, Carney AS, Woods CM, Ooi EH, et al. Comparing surgical experience with performance on a sinus surgery simulator. ANZ J Surg 2016;Dec. 86(12):990–5.
30. Ruthenbeck GS, Hobson J, Carney AS, Sloan S, Sacks R, Reynolds KJ. Toward photorealism in endoscopic sinus surgery simulation. Am J Rhinol Allergy 2013;Mar-Apr. 27(2):138–43.
31. Guetat A, Ancel A, Marchesin S, Dischler JM. Pre-integrated volume rendering with non-linear gradient interpolation. IEEE Trans Vis Comput Graph 2010;Nov-Dec. 16(6):1487–94.
32. Choudhury N, Gelinas-Phaneuf N, Delorme S, Del Maestro R. Fundamentals of neurosurgery: virtual reality tasks for training and evaluation of technical skills. World Neurosurg 2013;Nov. 80(5):e9–19.
33. Rosseau G, Bailes J, del Maestro R, Cabral A, Choudhury N, Comas O, et al. The development of a virtual simulator for training neurosurgeons to perform and perfect endoscopic endonasal transsphenoidal surgery. Neurosurgery 2013;Oct. 73 Suppl 1:85–93.
34. Varshney R, Frenkiel S, Nguyen LH, Young M, Del Maestro R, Zeitouni A, et al. Development of the McGill simulator for endoscopic sinus surgery: a new high-fidelity virtual reality simulator for endoscopic sinus surgery. Am J Rhinol Allergy 2014;Jul-Aug. 28(4):330–4.
35. Varshney R, Frenkiel S, Nguyen LH, Young M, Del Maestro R, Zeitouni A, et al. The McGill simulator for endoscopic sinus surgery (MSESS): a validation study. J Otolaryngol Head Neck Surg 2014;Oct. 43:40.
36. Castelnuovo P, Pistochini A, Locatelli D. Different surgical approaches to the sellar region: focusing on the “two nostrils four hands technique”. Rhinology 2006;Mar. 44(1):2–7.


Table 1.

The virtual simulators for endoscopic sinus and skull base surgery

No. Simulator Year Application and main focus Hardware and haptic device characteristics Software characteristics References and validation studies
1 Endoscopic sinus surgery simulator (ES3; Lockheed Martin, Bethesda, MD, USA) 1997 • FESS • Workstation simulation platform (Silicon Graphics, Mountain View, CA, USA) • Three modes (novice, intermediate, and advanced) associated with task complexity [3-15]
• Surgical training • PC-based haptic controller • The most extensively validated simulator
• Unilateral haptic manipulators
• Haptic feedback to instruments (except to the endoscope)
• A mannequin head
2 Nasal endoscopy simulator (Regensburg University Hospital, Regensburg, Germany) 1997 • FESS • Electromagnetic tracking system (sensors on the endoscope, instruments, and mannequin head) • Real-time collision detection and simulation of tissue deformation [16,17]
• Surgical training • No haptic feedback
• A mannequin head
3 Dextroscope (Volume Interactions, Singapore) 2003 • FESS • Workstation; mirrored display, stereoscopic glasses, stylus, and control handle (joystick) • The endoscope can be rotated from 0° to 360°, and objects can be magnified or reduced [18,19]
• EETSA • No mannequin head
 Other endoscopic skull-base surgery
• Surgical rehearsal
4 Simulation of transsphenoidal endoscopic pituitary surgery (Medical University Vienna and VRVis Research Center, Vienna, Austria) 2004 • EETSA • Integrated into the Impax EE PACS system (Agfa Healthcare, Bonn, Germany) • Collision detection and force feedback [20-23]
• Surgical rehearsal • Stealth Station image-guided navigation system (Medtronic, Minneapolis, MN, USA); the endoscope and instruments are optically tracked • Can simulate angled endoscopes
• Control handle (joystick) • Preoperative visualization of important anatomical structures
5 CardinalSim (Stanford University, Stanford, CA, USA) 2009 • FESS • Runs on standard PC hardware • Rapid reconstruction of patient-specific endonasal anatomy (1–2 hours) [24,25]
• EETSA • Features one haptic device • Real-time collision detection, simulation of tissue deformation, and force feedback
 Other endoscopic skull-base surgery • Accepts various commercial haptic devices
• Surgical rehearsal
6 VOXEL-MAN SinuSurg (University of Wurzburg, Wurzburg, Germany; Voxel-Man Group, University Medical Center Hamburg-Eppendorf, Hamburg, Germany; Helios Hospital Krefeld, Krefeld, Germany) 2010 • FESS • Runs on standard PC hardware • Customized algorithms for subvoxel visualization, volume cutting, and haptic rendering [26]
• Surgical training • Affords a stereoscopic view • Can accommodate angled endoscopes
• Fitted with the Phantom Omni haptic device (SensAble Technologies, Woburn, MA, USA)
7 Flinders sinus surgery simulator (Flinders University, Adelaide, Australia) 2013 • FESS • Bimanual haptic manipulators: Phantom Omni haptic devices (SensAble Technologies) and Novint Falcon (Novint Technologies, DE, USA) • Realistic mucosal texture and tissue deformation using voxel- and triangle-based surface mesh models [28-31]
• Surgical training • Runs on a laptop • Collision detection and force feedback
• No mannequin head • Shading algorithms
• Computer-generated effects of vasoconstrictive drugs
8 NeuroTouch Endo (National Research Council of Canada, Ottawa, Canada) 2013 • EETSA • Bimanual haptic manipulators: Phantom Omni devices • VR stereovision system; real-time physics-based computation of tissue deformation [32,33]
• Other endoscopic skull-base surgery • No mannequin head • Algorithms managing instrument-tissue contacts
• Surgical training
9 McGill simulator for endoscopic sinus surgery (National Research Council of Canada) 2014 • FESS • NeuroTouch platform • VR stereovision system; real-time physics-based computation of tissue deformation [34,35]
• Surgical training • Bimanual haptic manipulators: Phantom Omni devices with customized shafts • Algorithms managing instrument-tissue contacts
• A mannequin head

FESS, functional endoscopic sinus surgery; EETSA, endoscopic endonasal transsphenoidal approach; PACS, picture archiving and communication system; VR, virtual reality.