- •Contents
- •Introduction
- •Contributors
- •ROLE OF BIOPSY
- •DIRECTED TREATMENTS OF DISTINCT ORBITAL INFLAMMATIONS
- •ABSTRACT
- •ACKNOWLEDGEMENTS
- •5 Future and Emerging Treatments for Microbial Infections
- •MICROBIOLOGIC DIAGNOSIS
- •EMERGING ANTIBIOTIC RESISTANCE
- •HISTORICAL PERSPECTIVE
- •CURRENT APPROACH
- •FUTURE DIRECTIONS
- •7 Non-Hodgkin’s Lymphoma
- •INCIDENCE AND EPIDEMIOLOGY
- •ETIOLOGY AND RISK FACTORS
- •DIAGNOSIS, CLASSIFICATION, AND STAGING
- •TREATMENT
- •ABSTRACT
- •INTRODUCTION
- •STEPS TOWARD TUMOR SPECIFIC THERAPY
- •CANCER SPECIFIC MOLECULAR TARGETS
- •DNA ARRAY ANALYSIS
- •WHICH MOLECULAR TARGETS?
- •CONCLUSIONS
- •10 Malignant Lacrimal Gland Tumors
- •THERAPEUTIC RECOMMENDATIONS
- •SPHENOID WING MENINGIOMAS
- •Location
- •PRESENTING SIGNS AND SYMPTOMS
- •RADIOGRAPHIC IMAGING
- •ULTRASOUND
- •HISTOPATHOLOGY
- •TREATMENT AND PROGNOSIS
- •13 Stereotactic Radiotherapy for Optic Nerve and Meningeal Lesions
- •BACKGROUND
- •DEFINITIONS
- •Precise Immobilization
- •Precise Tumor Localization
- •Conformal Treatment Planning and Delivery
- •FUTURE DEVELOPMENTS
- •SUMMARY
- •ABSTRACT
- •INTRODUCTION
- •ABSTRACT
- •INTRODUCTION
- •Enzyme-Linked Immunosorbent Assay (ELISA)
- •Prospective Study of Graves’ Disease Patients
- •DISCUSSION
- •ACKNOWLEDGEMENTS
- •ORBITAL FIBROBLASTS DISPLAY CELL-SURFACE CD40 AND RESPOND TO CD154
- •CONCLUSIONS
- •ACKNOWLEDGEMENTS
- •INTRODUCTION
- •Retina, RPE, and Choroid
- •Optic Nerve
- •ACKNOWLEDGMENT
- •INTRODUCTION
- •METHODS
- •Historical Features
- •Tempo of Disease Onset
- •Clinical Features
- •DISCUSSION
- •19 Prognostic Factors
- •PREVENTION OF GRAVES’ OPHTHALMOPATHY BY EARLIER DIAGNOSIS AND TREATMENT OF GRAVES’ HYPERTHYROIDISM?
- •CLINICAL ACTIVITY SCORE
- •ORBITAL ECHOGRAPHY
- •ORBITAL OCTREOSCAN
- •ORBITAL MAGNETIC RESONANCE IMAGING
- •URINARY GLYCOSAMINOGLYCANS
- •SERUM CYTOKINES
- •CONCLUSION
- •BACKGROUND
- •VISA CLASSIFICATION
- •Strabismus
- •Appearance/Exposure
- •DISCUSSION
- •INTRODUCTION
- •NONSEVERE GRAVES’ OPHTHALMOPATHY
- •SEVERE GRAVES’ OPHTHALMOPATHY
- •Glucocorticoids
- •Orbital Radiotherapy
- •Immunosuppressive Drugs
- •Plasmapheresis
- •Somatostatin Analogues
- •Intravenous Immunoglobulins
- •Antioxidants
- •Cytokine Antagonists
- •Colchicine
- •INTRODUCTION
- •STABLE ORBITOPATHY
- •Preferred Decompression Techniques
- •EYE MUSCLE SURGERY
- •LID PROCEDURES
- •PATHOPHYSIOLOGY OF THE DISEASE
- •MEDICAL THERAPY
- •IMPROVEMENTS IN ORBITAL DECOMPRESSION
- •IMPROVEMENTS IN EYELID SURGERY
- •STRABISMUS SURGERY
- •Michael Kazim
- •John Kennerdell
- •Daphne Khoo
- •Claudio Marcocci
- •Jack Rootman
- •Wilmar Wiersinga
- •Answer
- •Question 1 (continued)
- •Answer
- •Question 2 (from M. Potts)
- •Answer
- •Question 2 (continued)
- •Question 3
- •Answer
- •Question 3 (continued)
- •Answer
- •Question 3 (continued)
- •Answer
- •Question 3 (continued)
- •Answer
- •Question 4 (from M. Mourits)
- •Answer
- •Question 5 (from F. Buffam)
- •Answer
- •Question 6 (from F. Buffam)
- •Answer
- •Question 7 (from P. Dolman)
- •Answer
- •INTRODUCTION
- •CLINICAL MANIFESTATIONS OF DVVMs
- •INVESTIGATION OF DVVMs
- •FUTURE CONSIDERATIONS
- •CONCLUSION
- •INTRODUCTION
- •CAROTID-CAVERNOUS SINUS FISTULAS
- •ARTERIOVENOUS MALFORMATIONS
- •DISTENSIBLE VENOUS ANOMALIES
- •PREOPERATIVE EMBOLIZATION OF TUMORS
- •ANEURYSMS
- •FUTURE DIRECTIONS
- •ABSTRACT
- •INTRODUCTION
- •TECHNOLOGICAL ADVANCEMENTS
- •Advances in Medical Imaging
- •Virtual Reality Surgical Simulation
- •Surgical Robotics
- •HUMAN BODY MODELS
- •FUTURE COMPUTER-AIDED ORBITAL SURGERY
- •SUMMARY
- •ACKNOWLEDGMENTS
- •30 The Future of Orbital Surgery
- •Index
The Future of Imaging in Orbital Disease | Nowinski
Surgical procedures involve 3D stereoscopic perception and 3D manipulation with six degrees of freedom (i.e., three translations and three rotations). To fully exploit the potential of (multimodal) medical images in computer-aided surgery, these images should be perceived stereoscopically and explored using 3D interaction.
There are several technologies for achieving 3D perception in surgical simulation, including stereoscopic glasses, head-mounted displays (HMDs), digital holography, virtual retinal displays, and virtual reality projectors (9). Stereoscopic glasses containing a shutter over each eye are among the most frequently used. The glasses are synchronized to a computer monitor that alternately generates left- and right-eye images. The HMD is a visual display system worn on the user’s head, usually as goggles. Limitations of the HMD include its size and weight, which produce encumbrance and fatigue; limited picture resolution and stereopsis; and the time lag and smoothness of the servo-mechanism that drives the remote camera to follow head movements.
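The left- and right-eye images the shutter glasses rely on come from rendering the scene from two camera positions separated by the viewer's interpupillary distance. A minimal sketch of that offset computation (the function name and the 64 mm default are illustrative, not from any particular system):

```python
import numpy as np

def stereo_eye_positions(camera_pos, look_dir, up, ipd=0.064):
    """Offset a single camera into left/right eye positions.

    Shutter glasses alternate which eye sees the monitor, so the
    renderer must produce two views separated by the interpupillary
    distance (ipd, roughly 64 mm for an average adult)."""
    look = np.asarray(look_dir, dtype=float)
    look /= np.linalg.norm(look)
    right = np.cross(look, np.asarray(up, dtype=float))
    right /= np.linalg.norm(right)          # unit vector pointing to the viewer's right
    offset = right * (ipd / 2.0)
    left_eye = np.asarray(camera_pos, dtype=float) - offset
    right_eye = np.asarray(camera_pos, dtype=float) + offset
    return left_eye, right_eye
```

Each frame is then rendered twice, once from each eye position, and the glasses shutter in sync with the monitor's refresh so each eye sees only its own view.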
To achieve 3D interaction with stereoscopic multimodal images, we have developed a device called the Dextroscope (10,11). It consists of a mirror-based display, 3D input devices, and a graphics workstation. A stereo virtual image is seen reflected in a mirror, allowing the user, wearing shuttered glasses, to reach into the virtual space below the mirror. The Dextroscope makes the physical space and the visual virtual space coincide, so that physical tools manipulate virtual objects in a hand–eye coordinated manner.
Virtual Reality Surgical Simulation
The term virtual reality (VR) refers to a human–computer interface that simulates a realistic environment while enabling user interaction. Surgeon–computer interaction may be multimodal, including speech, touch, and gesture recognition and synthesis. The surgeon may also see the real world with computer-generated images superimposed on the natural viewing field, so-called augmented reality. Augmented reality systems allow the surgeon to have real structures and anatomy as visual
landmarks, while supplementing them with useful clinical information such as presurgical scans and human body models.
Advances in more realistic physical models, ultrahigh-resolution displays, intuitive two-handed 3D interaction, rapid and accurate mapping of human body models onto patient-specific data, and more sensitive haptic devices will increase the usefulness of virtual and augmented reality in clinical practice.
The successful use of flight simulators has inspired their application to surgical training. A virtual reality surgical simulator typically consists of five components: human body models (the virtual patient), a physical modeler, a pathology modeler, virtual surgical instruments, and a VR interface. Before performing a surgical procedure on a real patient, the surgeon will be able to practice and plan the intervention on a high-fidelity computer model built from patient-specific data. In this way, the safest and most effective surgical approach can be planned and evaluated, requiring less time in the operating room and improving human performance. Careful planning will minimize surgical trauma and maximize the outcome for the patient (2).
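The five components listed above can be sketched as a minimal software architecture. All class and method names here are hypothetical illustrations of how the pieces fit together, not the design of any actual simulator:

```python
from dataclasses import dataclass, field

@dataclass
class SurgicalSimulator:
    """Illustrative skeleton of the five components of a VR surgical
    simulator named in the text; every name here is hypothetical."""
    virtual_patient: dict = field(default_factory=dict)  # human body models
    physical_modeler: object = None    # computes tissue deformation under forces
    pathology_modeler: object = None   # models the lesion being treated
    instruments: list = field(default_factory=list)      # virtual surgical tools
    vr_interface: object = None        # stereo display plus haptic input/output

    def step(self, dt):
        """One frame of the simulation loop: read the surgeon's input,
        apply instrument-tissue interaction via the physical modeler,
        update the pathology state, then render and output haptic
        forces. Left abstract in this sketch."""
        ...
```

The point of the decomposition is that each component can be improved independently: a better tissue model or a new instrument slots in without changing the loop.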
Computer-based surgical simulation is useful not only for preoperative planning but also for education, training, and skill assessment. Requirements for a surgical simulator include realism sufficient to accurately represent the detailed shape of the patient’s organs, real-time interaction, quantitative deformation, haptic feedback, and simulation of various surgical operations such as drilling (including vibration and sound), cutting, grasping, sucking, pushing, pinching, picking, suturing, vaporizing, coagulating, clipping, and knot tying.
The sense of touch is an integral part of surgery, and simulating it is an essential component of a surgical simulator. Forces and torques are produced via a haptic device. For this purpose, knowledge of the characteristics of the simulated living tissues is required (12). When an object is palpated with a haptic device, several components influence its feel, including stiffness, damping, and static and dynamic friction. The quality of feel is determined both by the realism of the biomedical model and by human perception.
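The stiffness, damping, and friction components just mentioned are commonly combined in a spring-damper contact model with Coulomb friction. A minimal sketch, with illustrative placeholder gains rather than measured tissue parameters:

```python
def haptic_force(penetration_depth, normal_velocity, tangential_velocity,
                 stiffness=800.0, damping=2.5,
                 mu_static=0.4, mu_kinetic=0.3, eps=1e-6):
    """Spring-damper contact model with Coulomb friction.

    penetration_depth: how far the haptic tip has entered the virtual
    tissue, in meters (zero or negative means no contact).
    Returns (normal_force, friction_force) in newtons. All gains are
    illustrative placeholders, not measured tissue values."""
    if penetration_depth <= 0.0:
        return 0.0, 0.0
    # Normal force: the spring term resists penetration, the damper
    # term resists motion into the tissue.
    fn = stiffness * penetration_depth + damping * normal_velocity
    fn = max(fn, 0.0)  # contact can only push back, never pull
    # Tangential force: static friction when (nearly) still,
    # kinetic friction while sliding along the surface.
    if abs(tangential_velocity) < eps:
        ft = mu_static * fn
    else:
        ft = mu_kinetic * fn
    return fn, ft
```

In a real simulator this computation runs at roughly 1 kHz, with the resulting forces sent to the haptic device's motors each cycle.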
Surgical Robotics
Robotic systems, introduced into surgery in the early 1980s, can increase the accuracy and dexterity of the surgeon, reduce the tremor of the human hand, and amplify or reduce the movements and/or forces applied by the surgeon (13).
The scale of operations in today’s surgical practice is becoming so small that even skilled surgeons are reaching the limits of their dexterity. The accuracy of manual intervention is limited by physiological tremor, which can be as large as 50 µm peak to peak at the tip of a hand-held instrument. Vitreoretinal microsurgery, for instance, involves removing membranes as thin as 20 µm from the retina. In addition, new treatments, such as cell implants, will require an accuracy of 10 µm or better, which can only be achieved with robotic teleoperators.
In microsurgery, the robot scales small motions and forces to the optimal range of human perception, enabling improved performance and new microsurgical procedures. Several robotic systems have been developed to support microsurgery. A six-degree-of-freedom manipulator for vascular microsurgery in the retina has been described by Jensen et al. (14), and a teleoperated microsurgical robot for eye surgery has been presented by Hunter et al. (15). A telemanipulator, the robot-assisted microsurgery (RAMS) system (14), scales down the surgeon’s hand motion and filters tremor, assisting surgeons in manipulating surgical instruments more precisely than is possible manually. In addition, forces sensed at the surgical instrument can be amplified. Urban et al. (17) have reported a system for microsurgery with kinesthetic feedback; its hexapod robot achieves a repeatable positioning accuracy of better than 2 µm, scales movements, and senses forces at the instruments, while the surgeon controls the robot from an operating cockpit. Taylor et al. (18) describe a steady-hand robotic system for microsurgical augmentation that provides smooth, tremor-free, precise positional control and force scaling.
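Motion scaling and tremor filtering of the kind these systems perform can be sketched in a few lines. Physiological tremor sits around 8–12 Hz, well above deliberate surgical motion, so a simple first-order low-pass filter already separates the two; the scale factor and cutoff below are illustrative values, not those of any cited system:

```python
import math

class SteadyHandFilter:
    """Scale down hand motion and suppress high-frequency tremor.

    A first-order low-pass filter attenuates the ~10 Hz tremor band
    while passing slow, deliberate motion; the output is then scaled
    so that, e.g., a 1 mm hand movement becomes a 0.1 mm tool
    movement. Parameter values are illustrative only."""
    def __init__(self, scale=0.1, cutoff_hz=2.0, sample_hz=1000.0):
        self.scale = scale
        # Discrete first-order low-pass coefficient for the cutoff.
        rc = 1.0 / (2.0 * math.pi * cutoff_hz)
        dt = 1.0 / sample_hz
        self.alpha = dt / (rc + dt)
        self.state = 0.0

    def update(self, hand_position):
        # Low-pass filter the raw hand position, then scale it down
        # before commanding the instrument tip.
        self.state += self.alpha * (hand_position - self.state)
        return self.scale * self.state
```

Run per axis at the servo rate, this passes slow motion essentially unchanged (apart from the deliberate down-scaling) while a 10 Hz tremor component is attenuated several-fold.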
There are numerous other technologies that enhance computer-aided surgery, including sensing, miniaturization, intelligent instruments, smart materials and biomaterials, microelectromechanical systems (MEMS), optics, and wireless communication.
HUMAN BODY MODELS
Deformable body models, along with warping techniques, provide a means for the analysis of medical images. Examples of practically useful deformable models are electronic brain atlases (19,20), which are applicable to several areas including functional neurosurgery (21–23), human brain mapping (24), neuroradiology (20), and neuroeducation (25).
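The simplest form of the warping such atlases rely on is an affine map from atlas coordinates into patient space. A sketch under that simplification (real atlas-to-patient registration is usually nonlinear; the function name is illustrative):

```python
import numpy as np

def warp_points(points, A, t):
    """Map atlas coordinates into patient space with an affine
    transform x' = A x + t.

    points: (N, 3) array of atlas coordinates.
    A: (3, 3) linear part (rotation, scaling, shear).
    t: (3,) translation. An affine warp is only the coarsest
    alignment step; deformable registration refines it locally."""
    pts = np.asarray(points, dtype=float)
    return pts @ np.asarray(A, dtype=float).T + np.asarray(t, dtype=float)
```

Once the transform is estimated from matching landmarks or image similarity, every labeled structure in the atlas can be carried over to the patient scan by applying it to the structure's coordinates.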
The material used for building a human body model may originate from various sources, such as radiological images or photographs of cadaveric sections. Alternatively, a model can be based on human and/or animal studies or derived from mathematical formulas, and subsequently constructed using computer-aided design (CAD) tools.
The construction of electronic orbital anatomy models is in its infancy, despite the fact that the orbit’s critical neurovascular structures are confined to a space of only about 30 cm³. Preliminary efforts typically do not use in vivo data; instead, the modeled structures are approximated by simplified geometric primitives such as spheres and arcs (7,26), generated from mathematical models (15,27), based on literature studies (28), or derived from cadaveric sections (8). The ophthalmic model described by Parshall (7) has two components, the eyeball and the orbit. The eyeball model was obtained from statistical norms based on the assumption that the eye has radially symmetrical outer walls; nonsymmetrical components were then added using CAD tools. To capture the irregular structures of the orbit, images of frozen cadaveric orbital sections were manually digitized, segmented, and registered. The two models were then manually registered to produce the final eye model. The model described by Hunter et al. (15) and Sagar et al. (27) is constructed from a parametric representation that can be displayed with varying degrees of detail. The corneal surface is obtained from confocal laser microscopy and the retinal vasculature from a fractal tree. In addition, a finite element method is used to
produce a physical model suitable for telesurgical simulation. A realistic and accurate simulation of the physical properties of the eye is a complicated process that requires the use of a supercomputer (29).
Our initial work to develop a deformable eye model for ophthalmic surgery planning uses multimodal data (8). The model of the cornea is obtained from Scheimpflug images, and some of the external orbital structures, including the rectus muscles, optic nerve, and sclera, are derived from the segmented cadaveric images of the Visible Human Female Data (Fig. 1b). Another model constructed in our laboratory contains the oculomotor nuclear complex and fascicles (28). This model facilitates an understanding of spatial relationships in the region and supports hypotheses about the possible concentration of neurons involved in convergence. The model is based on published studies in nonhuman primates and patients, and the 3D structures are constructed using CAD tools (Fig. 1c).
PRESENT COMPUTER-AIDED SURGERY AND SIMULATION
Simulators developed for ophthalmic surgery support various procedures such as retinal coagulation (30), cataract surgery (27,31), and vitrectomy (32,33). SOPHOCLE is a training simulator for retinal photocoagulation (17). Its objective is to teach laser retinal photocoagulation in different disorders using a virtual eye. The simulator uses a real slit lamp, but the fundus and the retinal photocoagulation impacts are virtual.
The eye surgery simulator (31) generates images of the eye and surgical instruments through a stereo operating system and controls the position and orientation of the chosen instrument by moving a stylus. Four instruments are simulated: scalpel, forceps, scissors, and phacoemulsifier. This enables simulated cutting of the sclera and insertion of a phacoemulsifier to remove a cataract. During the instrument–tissue interactions, three feedback motors generate component force feedback along three orthogonal axes. The procedures can be recorded for subsequent playback and analysis.
A virtual reality vitrectomy simulator (32) assists in training for the correction of retinal detachment. The VR interface consists of a 3D mouse and stereo glasses but has no tactile feedback. The simulator contains a deformable eye model, and the simulated instruments include a pick, blade, suction, cutter, laser, and drainage needle.
EyeSi (33) simulates vitrectomy in a more realistic way. Developed for training and rehearsal, EyeSi contains a mechanical eye that provides tactile feedback. Two physical instruments are inserted into this physical eye: one simulates a lamp, and the other can serve as a picker, cutter, or vitrector. The movements of the physical instruments are tracked by cameras, and their corresponding virtual movements within the computer eye model are displayed in a stereoscopic viewer that emulates the surgical microscope.
Much more effort has been spent on developing systems for brain surgery, and we will briefly illustrate state-of-the-art example systems developed in our laboratory for brain intervention. Several diseases can be treated endovascularly. The development of a computer simulator for endovascular intervention, however, is a challenging task. Such a simulator must have the standard generic features (discussed in the section on virtual reality surgical simulation) as well as provide advanced methods for segmentation of the vasculature, geometric modeling and meshing of the vasculature, physical modeling of interventional devices and vessels, hemodynamic analysis, and device–vessel interaction analysis. It must also contain a large database of interventional devices. In addition, specific clinical procedures have to be simulated, including angioplasty, stent placement, and aneurysm coiling. An example in Fig. 2a shows catheter navigation within a 3D cerebral model using our interventional neuroradiology simulator, NeuroCath (34).
The endovascular simulator must also provide hybrid visualization, including simulated fluoroscopy, surface and volume rendering, and virtual endoscopy. The latter type of display
