Virtual Reality Simulators for Endoscopic Sinus and Skull Base Surgery: The Present and Future
Abstract
Endoscopic sinus and skull base surgeries are minimally invasive surgical techniques that reduce postoperative symptoms and complications and enhance patients’ quality of life. However, to ensure excellent surgical outcomes after such interventions, intimate familiarity with important landmarks and high-level endoscope manipulation skills are essential. Cadaver training is one possible option, but cadavers are expensive, scarce, and nonreusable, and cadaver work requires specialized equipment and staff. In addition, it is difficult to mimic specific diseases using cadavers. Virtual reality simulators create a computerized environment in which the patient’s anatomy is reproduced and the trainee can interact with it through endoscope handling and realistic haptic feedback. Moreover, they can present scenarios that improve trainees’ skills and confidence. Therefore, virtual simulator training can be implemented at all levels of surgical education. This review introduces the current literature on virtual reality training for endoscopic sinus and skull base surgeons and discusses directions for future development.
INTRODUCTION
Since endoscopes were first used in paranasal sinus and skull base surgery, the indications for endoscopic surgery have expanded; these minimally invasive techniques reduce postoperative symptoms and complications and enhance patients’ quality of life [1,2]. Compared with the traditional microscope, the endoscope provides a wider field of view when treating pituitary adenomas and skull base conditions. However, to ensure excellent surgical outcomes after endoscopic endonasal intervention, intimate familiarity with important landmarks and high-level endoscope manipulation skills (including avoidance of instrument tangles in a narrow space) are essential. In particular, during skull base surgery, carotid artery or other vascular injuries can be debilitating or fatal, and cranial nerve injury can cause catastrophic disability. Although residents are trained to perform endoscopic surgery under close supervision, many critical procedures are still performed by the attending surgeon. Therefore, it is essential to shorten the learning curve for various surgical conditions. Cadaver training is one possible option, but cadavers are expensive, scarce, and nonreusable, and cadaver work requires specialized equipment and staff. In addition, it is difficult to mimic specific diseases using cadavers. Three-dimensional (3D) printed models and other alternatives have been developed, but they are expensive and share many of the limitations of cadaver training. Virtual reality can present scenarios that improve trainees’ skills and confidence, so virtual simulator training can be implemented at all levels of surgical education. Here, we review the current literature on virtual reality training for endoscopic sinus and skull base surgeons and discuss directions for future development.
VIRTUAL SIMULATORS OF ENDOSCOPIC SINUS AND SKULL BASE SURGERY
The available virtual simulators are listed in Table 1.
Endoscopic sinus surgery simulator
The endoscopic sinus surgery simulator (ES3) was developed in 1997 by Lockheed Martin (Bethesda, MD, USA) [3-5] to provide a virtual surgical environment in which a trainee manipulates an endoscope and other instruments inside the nasal cavity of a mannequin. All tools provide haptic feedback. The simulator features three levels of difficulty, allowing effective step-by-step practice. The entire performance is recorded and scored, and a penalty is applied when the trainee exceeds the time limit or damages important anatomical structures such as the optic nerve. Although there have been many validation studies [6-15], development costs rendered the simulator very expensive, and it is no longer available.
Nasal endoscopy simulator
The nasal endoscopy simulator (NES), developed in Germany, was also introduced in 1997 [16,17] to simulate endoscopic sinus surgery. The user explores a model head with actual surgical instruments whose positions are tracked electromagnetically. Unlike the ES3, the NES provides no haptic feedback. It tracks motion, detects collisions between instruments, and simulates tissue deformation; errors (such as instrument collisions or damage to crucial tissues) are recorded. This simulator also provides three levels of difficulty.
Dextroscope
The Dextroscope was developed in Singapore in 2003 [18,19]. The workstation features stereoscopic glasses, a stylus, and a control handle (a joystick). The 3D glasses and a mirror enable 3D interaction with patient-specific computed tomography (CT) and/or magnetic resonance imaging data; the Dextroscope does not use a model head. The joystick operates the endoscope and the stylus operates the microdebrider or drill. The 3D graphics reflect the anatomical shape; however, force feedback is lacking, as are physiological responses such as blanching after injection or bleeding. The model is also rather stiff, preventing gentle displacement of certain structures.
Simulation of transsphenoidal endoscopic pituitary surgery
The VRVis Research Center and the Department of Neurosurgery of the Medical University of Vienna collaborated to develop the virtual endoscopic simulation of transsphenoidal endoscopic pituitary surgery (STEPS) system between 2004 and 2009 [20-22]. This system trains surgeons in the preoperative planning and performance of transsphenoidal pituitary surgery. STEPS has been integrated into the Impax EE PACS system (Agfa HealthCare, Bonn, Germany). Preoperatively, the nasal cavity and skull base are reconstructed in a 3D virtual endoscopic environment. Crucial anatomical structures are displayed in the background behind semitransparent boundaries, and surface rigidity is color-coded. The user selects the anatomical structures to be displayed, views them from various angles, modifies visual parameters, defines landmarks, and removes tissue. Angled endoscopes and other simulated instruments, such as rongeurs, are used to replicate the transsphenoidal approach. To prevent instrument manipulations that are impossible in the real world, instrument collisions are detected and force feedback is delivered. In 2010, STEPS was upgraded through integration with the StealthStation image-guided navigation system (Medtronic, Minneapolis, MN, USA) [23]. The endoscope and additional instruments are now optically tracked during interventions, and their positions are passed to STEPS, which generates the corresponding virtual images.
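As a rough illustration of this navigation link (and not of the STEPS implementation itself), the following sketch shows how a tracked endoscope pose could drive a virtual camera; the function name, the coordinate conventions, and the assumption that the tracker reports a 4 × 4 rigid transform are illustrative only.

```python
# Minimal sketch (not the STEPS implementation): updating a virtual endoscope
# camera from an optically tracked pose. Assumes the navigation system reports
# a 4x4 rigid transform and that a patient-to-tracker registration is known.
import numpy as np

def camera_from_tracked_pose(T_tracker, T_registration):
    """Return the virtual camera position and viewing direction in CT coordinates.

    T_tracker      -- 4x4 pose of the endoscope reported by the tracking system
    T_registration -- 4x4 transform aligning tracker space with the CT volume
    """
    T = T_registration @ T_tracker                      # endoscope pose in CT space
    position = T[:3, 3]                                  # scope tip position
    view_dir = T[:3, :3] @ np.array([0.0, 0.0, 1.0])     # optical axis (assumed +z)
    return position, view_dir / np.linalg.norm(view_dir)
```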
CardinalSim
CardinalSim, which simulates endoscopic sinus surgery, was developed in 2009 [24]. Patient-specific endonasal anatomy is reconstructed from CT data, allowing a preoperative virtual flythrough. A model of an actual endoscope attached to a haptic device is available. The user can explore the virtual endonasal anatomy and, using a virtual microdebrider, remove tissue during simulated surgery. Force feedback creates resistance as the tip of the instrument contacts the reconstructed anatomy. A recent report stated that construction of a usable model from a CT scan required 1–2 hours, depending on the number and complexity of the segmented structures [25]. Preoperative rehearsal of sinus and skull base surgery can be conducted using standard PC hardware and a commercial haptic device.
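Contact resistance of this kind is commonly rendered as a penalty force proportional to the depth by which the instrument tip penetrates the reconstructed surface. The following sketch illustrates that general idea; it is not CardinalSim’s published algorithm, and the stiffness value is an arbitrary tuning parameter.

```python
# Illustrative sketch of penalty-based haptic rendering (not CardinalSim's
# published algorithm). Positions are in millimetres; the surface normal points
# outward and has unit length; the stiffness is an arbitrary tuning parameter.
import numpy as np

def contact_force(tip_pos, surface_point, surface_normal, stiffness=0.6):
    """Return the force (N) pushing the instrument tip back toward the surface."""
    penetration = float(np.dot(np.asarray(surface_point) - np.asarray(tip_pos),
                               surface_normal))
    if penetration <= 0.0:
        return np.zeros(3)                       # tip outside tissue: no resistance
    return stiffness * penetration * np.asarray(surface_normal)  # Hooke-like force
```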
VOXEL-MAN SinuSurg
The VOXEL-MAN SinuSurg was developed by a German research group in 2010 [26]. The basic layout is that of the VOXEL-MAN TempoSurg simulator of temporal bone surgery [27]. The simulator runs on standard PC hardware and features the “Phantom Omni” haptic device (SensAble Technologies, Woburn, MA, USA). Customized algorithms allow subvoxel visualization, volume cutting, and haptic rendering.
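In voxel-based simulators of this kind, volume cutting essentially amounts to clearing the voxels that fall within the working volume of the drill or debrider tip. The minimal sketch below illustrates the idea, omitting the subvoxel refinement described by the authors; the function name and parameters are illustrative assumptions.

```python
# Minimal illustration of volume cutting in a voxel-based simulator (omitting
# the subvoxel refinement described by the authors). All names and parameters
# are illustrative assumptions.
import numpy as np

def cut_sphere(volume, spacing_mm, tip_mm, radius_mm=1.5):
    """Clear all voxels whose centres lie within radius_mm of the cutter tip.

    volume     -- 3D array of densities, modified in place
    spacing_mm -- physical voxel size along each axis, e.g. (0.5, 0.5, 0.5)
    tip_mm     -- cutter-tip position in (z, y, x) order, same units as spacing
    Returns the number of tissue voxels removed by this call.
    """
    axes = [np.arange(n) * s for n, s in zip(volume.shape, spacing_mm)]
    Z, Y, X = np.meshgrid(*axes, indexing="ij")
    inside = ((Z - tip_mm[0]) ** 2 + (Y - tip_mm[1]) ** 2
              + (X - tip_mm[2]) ** 2) <= radius_mm ** 2
    removed = int(np.count_nonzero(volume[inside]))
    volume[inside] = 0                            # "drill away" the tissue
    return removed
```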
Flinders sinus surgery simulator
The Flinders sinus surgery simulator was developed in 2013 and features two haptic manipulators [28-30]. The “Novint Falcon” (Novint Technologies, Delaware, USA) senses the 3D position of an attached, controllable 0° endoscope and provides linear haptic feedback. The Phantom Omni haptic device (SensAble Technologies) senses both 3D position and orientation and provides linear haptic feedback to control the surgical instruments. Collision detection and force feedback are available; the textures of mucosa and deformable tissues are rendered realistically using voxel- and triangle-based surface mesh models, and Blinn-Phong lighting [31] is used to shade flat, featureless areas. The action of vasoconstrictive drugs is simulated to increase the space available for endoscopic instrumentation.
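For reference, the Blinn-Phong model cited above combines ambient, diffuse, and specular terms computed from the surface normal, the light direction, and a half-vector between the light and view directions; the sketch below writes it out with illustrative material constants.

```python
# The Blinn-Phong model referenced above, written out as a small sketch.
# Vectors are assumed to be unit length; the material constants are illustrative.
import numpy as np

def blinn_phong(normal, light_dir, view_dir,
                kd=0.7, ks=0.3, shininess=32.0, ambient=0.1):
    """Return a scalar shading intensity for a single light source."""
    diffuse = kd * max(float(np.dot(normal, light_dir)), 0.0)
    halfway = np.asarray(light_dir) + np.asarray(view_dir)
    halfway = halfway / np.linalg.norm(halfway)           # half-vector between L and V
    specular = ks * max(float(np.dot(normal, halfway)), 0.0) ** shininess
    return ambient + diffuse + specular
```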
NeuroTouch Endo
Engineers and neurosurgeons of the National Research Council of Canada collaborated to develop the NeuroTouch neurosurgical simulator (Fig. 1) [32,33]. NeuroTouch offers various training scenarios based on real cases, including transsphenoidal endonasal pituitary surgery. The first version allows the user to practice endoscopic manipulation and to locate the sphenoid ostium. The user inserts a virtual endoscope affording haptic feedback into the nostril and explores the simulated endonasal anatomy. When an instrument contacts tissue, the deformation is computed in real time based on physical modeling. When the tip of the endoscope touches the mucosa, the lens can be stained, blurring the view; forceful contact can trigger bleeding that obstructs the view. Stains or blood can be rinsed away by pressing a pedal. Important anatomical landmarks, such as the septum and ostium, can be labeled, and the labels become visible as the endoscope approaches the structures. The user can toggle between interior and exterior views of the nasal cavity, which is particularly useful when initially inserting the endoscope into the nostril. During the simulation, the trainee advances the endoscope to the sphenoid ostium, opens the ostium, and then enters the other nostril to find the opposite ostium. When the trainee reaches the first ostium, a “mission completed” message appears. The simulation ends when both sphenoid ostia have been found, and performance is scored according to the number of targets attained, the force used, and the time taken.
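The exact NeuroTouch scoring formula is not reproduced here, but a composite metric of this kind can be illustrated by a simple weighted combination of targets reached, peak applied force, and elapsed time, as in the hypothetical sketch below.

```python
# Hypothetical composite score (the actual NeuroTouch metric is not reproduced
# here): reward targets reached, penalise excess applied force and elapsed time.
def performance_score(targets_reached, targets_total,
                      peak_force_n, force_limit_n,
                      time_s, time_limit_s,
                      w_target=60.0, w_force=20.0, w_time=20.0):
    """Return a score between 0 and 100; higher is better."""
    target_term = w_target * targets_reached / targets_total
    force_term = w_force * min(force_limit_n / max(peak_force_n, 1e-6), 1.0)
    time_term = w_time * min(time_limit_s / max(time_s, 1e-6), 1.0)
    return target_term + force_term + time_term
```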
McGill simulator for endoscopic sinus surgery
The McGill simulator was developed by McGill University and the National Research Council of Canada [34]. It builds on the NeuroTouch platform, upgraded to a bimanual system in which the trainee uses both an endoscope and other instruments, as in real life. The hardware was improved: a mannequin head was added, the distance between the handle and the action axis of the haptic device was shortened, and additional haptic devices were inserted into the head using shafts [35]. However, inserting two haptic devices (the endoscope and an instrument) into the same nostril is impossible, because the action axis of the Phantom Omni haptic device is too thick.
DISCUSSION
Surgical simulators play important roles in the preoperative assessment of patient anatomy, in reducing the risk of injury to anatomical structures near the surgical target, and in helping novice surgeons overcome the steep learning curve associated with endoscopic procedures. Several virtual simulators of endoscopic sinus and skull base surgery are available. Future improvements will render cadaver training irrelevant.
Patient-specific algorithms for preoperative planning are needed. Usually, an anatomical model is reconstructed from radiological images. Such reconstructions are often inaccurate, even when the slices used are very thin. To correctly depict structures of interest, it is often necessary to resort to manual segmentation, which is time-consuming and tedious. Furthermore, the final visual result depends on the so-called transfer function, which maps the intensities of the radiological source data to colors and opacities. Selection of an inappropriate transfer function causes structures to be missed or displayed incorrectly. Therefore, software that resolves these issues is essential, and such efforts are underway.
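As a simple example of such a transfer function, CT intensities (in Hounsfield units) can be mapped to color and opacity with piecewise rules; the breakpoints below are illustrative only, and poorly chosen breakpoints are precisely how structures end up missed or misrendered.

```python
# Toy transfer function: map CT intensity (Hounsfield units) to colour and
# opacity for volume rendering. The breakpoints are illustrative only; poorly
# chosen breakpoints are exactly how structures end up missed or misrendered.
def transfer_function(hu):
    """Return (r, g, b, alpha) for a single voxel intensity."""
    if hu < -200:                       # air in the nasal cavity: fully transparent
        return (0.0, 0.0, 0.0, 0.0)
    if hu < 300:                        # soft tissue and mucosa: semi-transparent red
        return (0.8, 0.4, 0.4, 0.3)
    return (0.95, 0.95, 0.85, 0.9)      # bone: nearly opaque off-white
```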
Haptic devices that improve real-life surgical skills are required. Many simulators use commercial haptic devices; however, devices that move freely in space, without the physical constraints of the nostril, do not adequately build skills and confidence. Haptic devices for laparoscopy are being vigorously developed. A laparoscope enters via a “port,” and several mutually distant ports may be used; when an instrument port is available, haptic delivery and tracking are simple. However, the nostril is a natural orifice, not a port. In addition, the endoscope and instruments are inserted into the ipsilateral nostril during sinus and skull base surgery, and three instruments must operate very closely together during endoscopic skull base surgery using the two-nostrils/four-hands technique (Fig. 2) [36]. Therefore, developing suitable haptic devices is very challenging. Nevertheless, training in how an endoscope and instruments interact in the nostril is very important, particularly when the endoscope is angled and the instruments are curved. The haptic devices of the McGill simulator for endoscopic sinus surgery are improving constantly; however, devices that allow two thin, light instruments to pass through one nostril while providing realistic microsurgical feedback are still needed.
Finally, an assessment system that allows novice surgeons to climb the learning curve is required. Validation testing of, and improvements in, software and hardware are ongoing, but this work is time-consuming and expensive; the ES3, for example, has been extensively validated and assessed. To balance cost-effectiveness and quality control, software and hardware that can be readily upgraded are needed, along with an interactive interface such as a Content Service Platform: an online service platform for the simulation system that manages training content, assessment data, and user account information, and supports a coaching community.
The points noted above represent the general goals of virtual reality simulators. However, the main focus differs according to each simulator’s objective: surgical rehearsal programs should concentrate on fast, realistic reconstruction of patient images, whereas surgical training simulators should concentrate on scenarios and assessment systems.
Several virtual reality simulators mimicking real surgical settings are available and have improved over time. However, optimized rendering software, custom haptic devices for endonasal surgery, and appropriate trainee assessment are still needed. Currently, cadaver training is optimal, but it is costly and difficult to access. In the future, virtual reality simulators will render cadaver training obsolete.
HIGHLIGHTS
▪ There is a learning curve associated with endoscopic sinus and skull base surgery.
▪ Virtual reality simulators can be effective training tools that help novice surgeons shorten the learning curve.
▪ Optimized software, custom-made devices, and proper assessment are required in this field.
Notes
No potential conflict of interest relevant to this article was reported.
Acknowledgements
This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (2017R1D1A1B03027903), the Bio & Medical Technology Development Program of the NRF funded by the Ministry of Science and ICT (2018M3A9E8020856), and the Korea Health Industry Development Institute funded by the Ministry of Health and Welfare (HI14C3228, HI15C0133), Republic of Korea.