- •Contents
- •1.1. Introduction to the Eye
- •1.2. The Anatomy of the Human Visual System
- •1.3. Neurons
- •1.4. Synapses
- •1.5. Vision — Sensory Transduction
- •1.6. Retinal Processing
- •1.7. Visual Processing in the Brain
- •1.8. Biological Vision and Computer Vision Algorithms
- •References
- •2.1. Introduction to Computational Methods for Feature Detection
- •2.2. Preprocessing Methods for Retinal Images
- •2.2.1. Illumination Effect Reduction
- •2.2.1.1. Non-linear brightness transform
- •2.2.2. Image Normalization and Enhancement
- •2.2.2.1. Color channel transformations
- •2.2.2.3. Local adaptive contrast enhancement
- •2.2.2.4. Histogram transformations
- •2.3. Segmentation Methods for Retinal Anatomy Detection and Localization
- •2.3.1. Boundary Detection Methods
- •2.3.1.1. First-order difference operators
- •2.3.1.2. Second-order boundary detection
- •2.3.1.3. Canny edge detection
- •2.3.2. Edge Linkage Methods for Boundary Detection
- •2.3.2.1. Local neighborhood gradient thresholding
- •2.3.2.2. Morphological operations for edge link enhancement
- •2.3.2.3. Hough transform for edge linking
- •2.3.3. Thresholding for Image Segmentation
- •2.3.3.1. Segmentation with a single threshold
- •2.3.3.2. Multi-level thresholding
- •2.3.3.3. Windowed thresholding
- •2.3.4. Region-Based Methods for Image Segmentation
- •2.3.4.1. Region growing
- •2.3.4.2. Watershed segmentation
- •2.4.1. Statistical Features
- •2.4.1.1. Geometric descriptors
- •2.4.1.2. Texture features
- •2.4.1.3. Invariant moments
- •2.4.2. Data Transformations
- •2.4.2.1. Fourier descriptors
- •2.4.2.2. Principal component analysis (PCA)
- •2.4.3. Multiscale Features
- •2.4.3.1. Wavelet transform
- •2.4.3.2. Scale-space methods for feature extraction
- •2.5. Summary
- •References
- •3.1.1. EBM Process
- •3.1.2. Evidence-Based Medical Issues
- •3.1.3. Value-Based Evidence
- •3.2.1. Economic Evaluation
- •3.2.2. Decision Analysis Method
- •3.2.3. Advantages of Decision Analysis
- •3.2.4. Perspective in Decision Analysis
- •3.2.5. Decision Tree in Decision Analysis
- •3.3. Use of Information Technologies for Diagnosis in Ophthalmology
- •3.3.1. Data Mining in Ophthalmology
- •3.3.2. Graphical User Interface
- •3.4. Role of Computational System in Curing Disease of an Eye
- •3.4.1. Computational Decision Support System: Diabetic Retinopathy
- •3.4.1.1. Wavelet-based neural network
- •3.4.1.2. Content-based image retrieval
- •3.4.2. Computational Decision Support System: Cataracts
- •3.4.2.2. K nearest neighbors
- •3.4.2.3. GUI of the system
- •3.4.3. Computational Decision Support System: Glaucoma
- •3.4.3.1. Using fuzzy logic
- •3.4.4. Computational Decision Support System: Blepharitis, Rosacea, Sjögren, and Dry Eyes
- •3.4.4.1. Utility of bleb imaging with anterior segment OCT in clinical decision making
- •3.4.4.2. Computational decision support system: RD
- •3.4.4.3. Role of computational system
- •3.4.5. Computational Decision Support System: Amblyopia
- •3.4.5.1. Role of computational decision support system in amblyopia
- •3.5. Conclusion
- •References
- •4.1. Introduction to Oxygen in the Retina
- •4.1.1. Microelectrode Methods
- •4.1.2. Phosphorescence Dye Method
- •4.1.3. Spectrographic Method
- •4.1.6. HSI Method
- •4.2. Experiment One
- •4.2.1. Methods and Materials
- •4.2.1.1. Animals
- •4.2.1.2. Systemic oxygen saturation
- •4.2.1.3. Intraocular pressure
- •4.2.1.4. Fundus camera
- •4.2.1.5. Hyperspectral imaging
- •4.2.1.6. Extraction of spectral curves
- •4.2.1.7. Mapping relative oxygen saturation
- •4.2.1.8. Relative saturation indices (RSIs)
- •4.2.2. Results
- •4.2.2.1. Spectral signatures
- •4.2.2.2. Oxygen breathing
- •4.2.2.3. Intraocular pressure
- •4.2.2.4. Responses to oxygen breathing
- •4.2.2.5. Responses to high IOP
- •4.2.3. Discussion
- •4.2.3.1. Pure oxygen breathing experiment
- •4.2.3.2. IOP perturbation experiment
- •4.2.3.3. Hyperspectral imaging
- •4.3. Experiment Two
- •4.3.1. Methods and Materials
- •4.3.1.1. Animals, anesthesia, blood pressure, and IOP perturbation
- •4.3.1.3. Spectral determinant of percentage oxygen saturation
- •4.3.1.5. Preparation and calibration of red blood cell suspensions
- •4.3.2. Results
- •4.3.2.2. Oxygen saturation of the ONH
- •4.3.3. Discussion
- •4.3.4. Conclusions
- •4.4. Experiment Three
- •4.4.1. Methods and Materials
- •4.4.1.1. Compliance testing
- •4.4.1.2. Hyperspectral imaging
- •4.4.1.3. Selection of ONH structures
- •4.4.1.4. Statistical methods
- •4.4.2. Results
- •4.4.2.1. Compliance testing
- •4.4.2.2. Blood spectra from ONH structures
- •4.4.2.3. Oxygen saturation of ONH structures
- •4.4.2.4. Oxygen saturation maps
- •4.4.3. Discussion
- •4.5. Experiment Four
- •4.5.1. Methods and Materials
- •4.5.2. Results
- •4.5.3. Discussion
- •4.6. Experiment Five
- •4.6.1. Methods and Materials
- •4.6.1.3. Automatic control point detection
- •4.6.1.4. Fused image optimization
- •4.7. Conclusion
- •References
- •5.1. Introduction to Thermography
- •5.2. Data Acquisition
- •5.3. Methods
- •5.3.1. Snake and GVF
- •5.3.2. Target Tracing Function and Genetic Algorithm
- •5.3.3. Locating Cornea
- •5.4. Results
- •5.5. Discussion
- •5.6. Conclusion
- •References
- •6.1. Introduction to Glaucoma
- •6.1.1. Glaucoma Types
- •6.1.1.1. Primary open-angle glaucoma
- •6.1.1.2. Angle-closure glaucoma
- •6.1.2. Diagnosis of Glaucoma
- •6.2. Materials and Methods
- •6.2.1. c/d Ratio
- •6.2.2. Measuring the Area of Blood Vessels
- •6.2.3. Measuring the ISNT Ratio
- •6.3. Results
- •6.4. Discussion
- •6.5. Conclusion
- •References
- •7.1. Introduction to Temperature Distribution
- •7.3. Mathematical Model
- •7.3.1. The Human Eye
- •7.3.2. The Eye Tumor
- •7.3.3. Governing Equations
- •7.3.4. Boundary Conditions
- •7.4. Material Properties
- •7.5. Numerical Scheme
- •7.5.1. Integro-Differential Equations
- •7.6. Results
- •7.6.1. Numerical Model
- •7.6.2. Case 1
- •7.6.3. Case 2
- •7.6.4. Discussion
- •7.7. Parametric Optimization
- •7.7.1. Analysis of Variance
- •7.7.2. Taguchi Method
- •7.7.3. Discussion
- •7.8. Concluding Remarks
- •References
- •8.1. Introduction to IR Thermography
- •8.2. Infrared Thermography and the Measured OST
- •8.3. The Acquisition of OST
- •8.3.1. Manual Measures
- •8.3.2. Semi-Automated and Fully Automated
- •8.4. Applications to Ocular Studies
- •8.4.1. On Ocular Physiologies
- •8.4.2. On Ocular Diseases and Surgery
- •8.5. Discussion
- •References
- •9.1. Introduction
- •9.1.1. Preprocessing
- •9.1.1.1. Shade correction
- •9.1.1.2. Hough transform
- •9.1.1.3. Top-hat transform
- •9.1.2. Image Segmentation
- •9.1.2.1. The region approach
- •9.1.2.2. The gradient-based method
- •9.1.2.3. Edge detection
- •9.1.2.3.2. The second-order derivative methods
- •9.1.2.3.3. The optimal edge detector
- •9.2. Image Registration
- •9.4. Automated, Integrated Image Analysis Systems
- •9.5. Conclusion
- •References
- •10.1. Introduction to Diabetic Retinopathy
- •10.2. Data Acquisition
- •10.3. Feature Extraction
- •10.3.1. Blood Vessel Detection
- •10.3.2. Exudates Detection
- •10.3.3. Hemorrhages Detection
- •10.3.4. Contrast
- •10.4.1. Backpropagation Algorithm
- •10.5. Results
- •10.6. Discussion
- •10.7. Conclusion
- •References
- •11.1. Related Studies
- •11.2.1. Encryption
- •11.3. Compression Technique
- •11.3.1. Huffman Coding
- •11.4. Error Control Coding
- •11.4.1. Hamming Codes
- •11.4.2. BCH Codes
- •11.4.3. Convolutional Codes
- •11.4.4. RS Codes
- •11.4.5. Turbo Codes
- •11.5. Results
- •11.5.1. Using Turbo Codes for Transmission of Retinal Fundus Image
- •11.6. Discussion
- •11.7. Conclusion
- •References
- •12.1. Introduction to Laser-Thermokeratoplasty (LTKP)
- •12.2. Characteristics of LTKP
- •12.3. Pulsed Laser
- •12.4. Continuous-Wave Laser
- •12.5. Mathematical Model
- •12.5.1. Model Description
- •12.5.2. Governing Equations
- •12.5.3. Initial-Boundary Conditions
- •12.6. Numerical Scheme
- •12.6.1. Integro-Differential Equation
- •12.7. Results
- •12.7.1. Pulsed Laser
- •12.7.2. Continuous-Wave Laser
- •12.7.3. Thermal Damage Assessment
- •12.8. Discussion
- •12.9. Concluding Remarks
- •References
- •13.1. Introduction to Optical Eye Modeling
- •13.1.1. Ocular Measurements for Optical Eye Modeling
- •13.1.1.1. Curvature, dimension, thickness, or distance parameters of ocular elements
- •13.1.1.2. Three-dimensional (3D) corneal topography
- •13.1.1.3. Crystalline lens parameters
- •13.1.1.4. Refractive index
- •13.1.1.5. Wavefront aberration
- •13.1.2. Eye Modeling Using Contemporary Optical Design Software
- •13.1.3. Optical Optimization and Merit Function
- •13.2. Personalized and Population-Based Eye Modeling
- •13.2.1. Customized Eye Modeling
- •13.2.1.1. Optimization to the refractive error
- •13.2.1.2. Optimization to the wavefront measurement
- •13.2.1.3. Tolerance analysis
- •13.2.2. Population-Based Eye Modeling
- •13.2.2.1. Accommodative eye modeling
- •13.2.2.2. Ametropic eye modeling
- •13.2.2.3. Modeling with consideration of ocular growth and aging
- •13.2.2.4. Modeling for disease development
- •13.2.3. Validation of Eye Models
- •13.2.3.1. Point spread function and modulation transfer function
- •13.2.3.2. Letter chart simulation
- •13.2.3.3. Night/day vision simulation
- •13.3. Other Modeling Considerations
- •13.3.1. Stiles Crawford Effect (SCE)
- •13.3.1.2. Other retinal properties
- •13.3.1.4. Optical opacity
- •13.4. Examples of Ophthalmic Simulations
- •13.4.1. Simulation of Retinoscopy Measurements with Eye Models
- •13.4.2. Simulation of PR
- •13.5. Conclusion
- •References
- •14.1. Network Infrastructure
- •14.1.1. System Requirements
- •14.1.2. Network Architecture Design
- •14.1.4. GUI Design
- •14.1.5. Performance Evaluation of the Network
- •14.2. Image Analysis
- •14.2.1. Vascular Tree Segmentation
- •14.2.2. Quality Assessment
- •14.2.3. ON Detection
- •14.2.4. Macula Localization
- •14.2.5. Lesion Segmentation
- •14.2.7. Patient Demographics and Statistical Outcomes
- •14.2.8. Disease State Assessment
- •14.2.9. Image QA
- •Acknowledgments
- •References
- •Index
Automatic Diagnosis of Glaucoma Using Digital Fundus Images
One application of GMMs is to form smooth approximations of arbitrarily shaped densities, which makes them a useful tool for modeling multi-modally distributed data. Another useful characteristic is that a GMM can employ a diagonal covariance matrix, which is less complex than the full covariance matrices usually required,18 significantly reducing the computational complexity of the algorithm. GMMs have been used in many areas, such as pattern recognition and classification, and have proved highly successful in identification and verification tasks.
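The diagonal-covariance GMM classification described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the plain EM fitter, the feature scaling, and the synthetic data are all assumptions made for the example; one mixture is fitted per class and a sample is assigned to the class with the higher log-likelihood.

```python
import numpy as np

def fit_diag_gmm(X, k=2, iters=100, seed=0):
    """Fit a k-component GMM with diagonal covariances by plain EM."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mu = X[rng.choice(n, size=k, replace=False)]    # component means
    var = np.tile(X.var(axis=0) + 1e-6, (k, 1))     # diagonal variances
    w = np.full(k, 1.0 / k)                         # mixing weights
    for _ in range(iters):
        # E-step: responsibilities from per-component diagonal Gaussians.
        logp = (np.log(w)
                - 0.5 * (((X[:, None, :] - mu) ** 2) / var
                         + np.log(2 * np.pi * var)).sum(axis=2))
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances.
        nk = r.sum(axis=0) + 1e-12
        w = nk / n
        mu = (r.T @ X) / nk[:, None]
        var = np.maximum((r.T @ X ** 2) / nk[:, None] - mu ** 2, 1e-6)
    return w, mu, var

def loglik(x, w, mu, var):
    """Log-likelihood of a single sample under a fitted mixture."""
    comp = -0.5 * (((x - mu) ** 2) / var + np.log(2 * np.pi * var)).sum(axis=1)
    m = comp.max()
    return m + np.log(np.sum(w * np.exp(comp - m)))

# Hypothetical, synthetic 3-feature vectors (c/d ratio, scaled vessel
# area, ISNT ratio); illustrative values only, not the study data.
rng = np.random.default_rng(1)
normal = rng.normal([0.34, 0.29, 1.02], 0.03, size=(60, 3))
glaucoma = rng.normal([0.50, 0.36, 1.04], 0.03, size=(60, 3))

gn = fit_diag_gmm(normal)
gg = fit_diag_gmm(glaucoma)

x = np.array([0.51, 0.35, 1.04])
label = "glaucoma" if loglik(x, *gg) > loglik(x, *gn) else "normal"
print(label)
```

Because the covariances are diagonal, each component needs only d variance parameters instead of d(d+1)/2, which is the complexity saving noted above.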
6.3. Results
We start the discussion of the results by listing the mean and standard deviation of the computed features in Table 6.1. The c/d ratio and the area of blood vessels are larger for glaucoma, due to the increase in pressure. The c/d ratio is 0.343 ± 0.245 for normal subjects and 0.503 ± 0.221 for glaucoma subjects. The blood vessel area for normal subjects is 29254.3 ± 10775.5, and it increases in glaucoma subjects (35746 ± 11443.2). The ISNT ratio is also greater for subjects suffering from glaucoma (1.037 ± 0.021) than for normal subjects (1.024 ± 0.02). We conducted a Student t-test on the two groups (normal and glaucoma subjects) for each feature, and the resulting p values were all below 0.03, indicating that these differences are statistically significant.
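The group comparison above can be reproduced in outline with a two-sample Student t-test. A minimal sketch, assuming synthetic draws that match the reported c/d ratio statistics; the sample size of 30 per group is a plausible placeholder, not a figure taken from the study:

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(42)
# Synthetic c/d ratios drawn from the reported mean +/- SD (hypothetical n = 30).
normal_cd = rng.normal(0.343, 0.245, size=30)
glaucoma_cd = rng.normal(0.503, 0.221, size=30)

# Two-sample Student t-test on the two groups.
t_stat, p_value = ttest_ind(normal_cd, glaucoma_cd)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```

A p value below the conventional 0.05 threshold, as reported in Table 6.1, indicates a statistically significant difference between the groups; with synthetic draws the exact value varies with the seed and sample size.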
Table 6.2 lists how many samples were used for training and testing the classifier, together with the classification results. In this investigation, 42 images were used for training and the remaining 18 images were used for testing. During classification, only one normal sample was classified as abnormal and two glaucoma images were classified as normal. The average classification rate was 83.33%.

Table 6.1. Values of three features for normal and glaucoma cases.

| Features | Normal | Glaucoma | p value |
| --- | --- | --- | --- |
| c/d ratio | 0.343 ± 0.245 | 0.503 ± 0.221 | 0.01 |
| Blood vessels | 29254.3 ± 10775.5 | 35746 ± 11443.2 | 0.03 |
| ISNT ratio | 1.024 ± 0.02 | 1.037 ± 0.021 | 0.02 |

Rajendra Acharya U. et al.

Table 6.2. Classification results.

| Type of image | Data sets used for training | Data sets used for testing | Correctly classified test data | Percentage correctly classified |
| --- | --- | --- | --- | --- |
| Normal | 21 | 9 | 8 | 88.89 |
| Glaucoma | 21 | 9 | 7 | 77.78 |
| Average | | | | 83.33 |

Table 6.3. Sensitivity, specificity, and positive predictive values for the GMM classifier.

| Classifier | TN | TP | FP | FN | Sensitivity | Specificity | Positive predictive accuracy |
| --- | --- | --- | --- | --- | --- | --- | --- |
| GMM | 8 | 7 | 1 | 2 | 77.78% | 88.89% | 87.5% |
Table 6.3 shows the sensitivity, specificity, and positive predictive accuracy for the two classes. In the table, true positive (TP) denotes the number of glaucoma images correctly classified as glaucoma, true negative (TN) the number of normal images correctly identified as normal, false negative (FN) the number of glaucoma samples misclassified as normal, and false positive (FP) the number of normal images misclassified as glaucoma. Sensitivity is the probability that an abnormal subject is correctly classified as abnormal; specificity is the probability that a normal subject is correctly identified as normal by the classifier. The proposed system detects glaucoma with a sensitivity of 77.78% and a specificity of 88.89%, and its positive predictive value is 87.50%.
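The quantities in Table 6.3 follow directly from the four confusion counts; a short sketch using the values reported for the GMM classifier:

```python
# Confusion counts for the GMM classifier, as reported in Table 6.3.
TP, TN, FP, FN = 7, 8, 1, 2

sensitivity = TP / (TP + FN)   # glaucoma images correctly flagged as glaucoma
specificity = TN / (TN + FP)   # normal images correctly passed as normal
ppv = TP / (TP + FP)           # positive predictive value of a "glaucoma" call
accuracy = (TP + TN) / (TP + TN + FP + FN)

print(f"sensitivity = {sensitivity:.2%}")  # 77.78%
print(f"specificity = {specificity:.2%}")  # 88.89%
print(f"PPV         = {ppv:.2%}")          # 87.50%
print(f"accuracy    = {accuracy:.2%}")     # 83.33%
```

The computed accuracy of 83.33% matches the average classification rate in Table 6.2, since 15 of the 18 test images are classified correctly.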
A graphical user interface (GUI) was developed in this work to let a user access the algorithm in a user-friendly manner, as illustrated in Fig. 6.12. It comprised an input image display, the extracted feature values, a review of the patient's last visit, patient data selection buttons and displays, and a classification textbox.

Fig. 6.12. The GUI.

Using the Patient Data button, the patient image file (Image 2) was loaded. The original image, optic disc, optic cup, blood vessels, and the inferior, superior, nasal, and temporal blood vessels were then displayed. The patient details (patient ID, gender, age, attending physician, date of birth, race, date of scan, and previous visit) were automatically displayed in the Patient Data section. In addition, the right-hand corner of the display showed an image from an earlier visit, with provision to display the optic disc and blood vessel images and the ISNT ratio.
6.4. Discussion
We have extracted three features to detect glaucoma automatically. These features are clinically significant and can identify the disease with an accuracy of 83%. It is important to diagnose glaucoma at an early stage in order to minimize damage to the optic nerve; only if this damage is minimal can the disease be treated effectively and its progression prevented.
Previously, six fuzzy classification algorithms were employed to detect the presence or absence of glaucoma, with a classification rate below 76%.10 An ANN was proposed to recognize glaucomatous visual field defects, and its diagnostic accuracy was compared with that of other algorithms.19 In that study, the Glaucoma Hemifield Test attained a sensitivity of 92% at 91% specificity, while the ANN achieved a sensitivity of 93% and a specificity of 94%, with an area under the ROC curve of 0.984.
Bowd et al. used neural network techniques to differentiate glaucomatous from nonglaucomatous eyes based on optic disc topography parameters extracted from the Heidelberg retina tomograph.20 The areas under the ROC curves were 0.938 for the linear SVM, 0.945 for the Gaussian SVM, 0.941 for the MLP, and 0.906 for the LDF. With forward selection and backward elimination optimization, the areas under the ROC curves for the Gaussian SVM and the current LDF increased to approximately 0.96. Hence, neural network analyses can increase the diagnostic accuracy of tests for glaucoma.
Recently, Nayak et al. proposed a novel method for glaucoma detection using the c/d ratio, the ratio of the distance between the optic disc center and the optic nerve head to the diameter of the optic disc, and the ratio of the blood vessel area in the inferior-superior quadrants to that in the nasal-temporal quadrants.14 The resulting feature vector was fed to a neural network for classification. Their system classified glaucoma automatically with a sensitivity of 100% and a specificity of 80%.
Our results show a sensitivity of 77.78% and a specificity of 88.89% for the proposed system. We expect that the accuracy of the classification system can be improved by using more parameters, such as texture features, and by increasing the number of training and testing images. Environmental lighting also plays an important role in classifier performance, so acquiring fundus images under uniform lighting conditions should yield better results. This method can serve as an adjunct tool to help a physician cross-check his or her diagnosis.
