- •Contents
- •1.1. Introduction to the Eye
- •1.2. The Anatomy of the Human Visual System
- •1.3. Neurons
- •1.4. Synapses
- •1.5. Vision — Sensory Transduction
- •1.6. Retinal Processing
- •1.7. Visual Processing in the Brain
- •1.8. Biological Vision and Computer Vision Algorithms
- •References
- •2.1. Introduction to Computational Methods for Feature Detection
- •2.2. Preprocessing Methods for Retinal Images
- •2.2.1. Illumination Effect Reduction
- •2.2.1.1. Non-linear brightness transform
- •2.2.2. Image Normalization and Enhancement
- •2.2.2.1. Color channel transformations
- •2.2.2.3. Local adaptive contrast enhancement
- •2.2.2.4. Histogram transformations
- •2.3. Segmentation Methods for Retinal Anatomy Detection and Localization
- •2.3.1. Boundary Detection Methods
- •2.3.1.1. First-order difference operators
- •2.3.1.2. Second-order boundary detection
- •2.3.1.3. Canny edge detection
- •2.3.2. Edge Linkage Methods for Boundary Detection
- •2.3.2.1. Local neighborhood gradient thresholding
- •2.3.2.2. Morphological operations for edge link enhancement
- •2.3.2.3. Hough transform for edge linking
- •2.3.3. Thresholding for Image Segmentation
- •2.3.3.1. Segmentation with a single threshold
- •2.3.3.2. Multi-level thresholding
- •2.3.3.3. Windowed thresholding
- •2.3.4. Region-Based Methods for Image Segmentation
- •2.3.4.1. Region growing
- •2.3.4.2. Watershed segmentation
- •2.4.1. Statistical Features
- •2.4.1.1. Geometric descriptors
- •2.4.1.2. Texture features
- •2.4.1.3. Invariant moments
- •2.4.2. Data Transformations
- •2.4.2.1. Fourier descriptors
- •2.4.2.2. Principal component analysis (PCA)
- •2.4.3. Multiscale Features
- •2.4.3.1. Wavelet transform
- •2.4.3.2. Scale-space methods for feature extraction
- •2.5. Summary
- •References
- •3.1.1. EBM Process
- •3.1.2. Evidence-Based Medical Issues
- •3.1.3. Value-Based Evidence
- •3.2.1. Economic Evaluation
- •3.2.2. Decision Analysis Method
- •3.2.3. Advantages of Decision Analysis
- •3.2.4. Perspective in Decision Analysis
- •3.2.5. Decision Tree in Decision Analysis
- •3.3. Use of Information Technologies for Diagnosis in Ophthalmology
- •3.3.1. Data Mining in Ophthalmology
- •3.3.2. Graphical User Interface
- •3.4. Role of Computational System in Curing Disease of an Eye
- •3.4.1. Computational Decision Support System: Diabetic Retinopathy
- •3.4.1.1. Wavelet-based neural network
- •3.4.1.2. Content-based image retrieval
- •3.4.2. Computational Decision Support System: Cataracts
- •3.4.2.2. K nearest neighbors
- •3.4.2.3. GUI of the system
- •3.4.3. Computational Decision Support System: Glaucoma
- •3.4.3.1. Using fuzzy logic
- •3.4.4. Computational Decision Support System: Blepharitis, Rosacea, Sjögren, and Dry Eyes
- •3.4.4.1. Utility of bleb imaging with anterior segment OCT in clinical decision making
- •3.4.4.2. Computational decision support system: RD
- •3.4.4.3. Role of computational system
- •3.4.5. Computational Decision Support System: Amblyopia
- •3.4.5.1. Role of computational decision support system in amblyopia
- •3.5. Conclusion
- •References
- •4.1. Introduction to Oxygen in the Retina
- •4.1.1. Microelectrode Methods
- •4.1.2. Phosphorescence Dye Method
- •4.1.3. Spectrographic Method
- •4.1.6. HSI Method
- •4.2. Experiment One
- •4.2.1. Methods and Materials
- •4.2.1.1. Animals
- •4.2.1.2. Systemic oxygen saturation
- •4.2.1.3. Intraocular pressure
- •4.2.1.4. Fundus camera
- •4.2.1.5. Hyperspectral imaging
- •4.2.1.6. Extraction of spectral curves
- •4.2.1.7. Mapping relative oxygen saturation
- •4.2.1.8. Relative saturation indices (RSIs)
- •4.2.2. Results
- •4.2.2.1. Spectral signatures
- •4.2.2.2. Oxygen breathing
- •4.2.2.3. Intraocular pressure
- •4.2.2.4. Responses to oxygen breathing
- •4.2.2.5. Responses to high IOP
- •4.2.3. Discussion
- •4.2.3.1. Pure oxygen breathing experiment
- •4.2.3.2. IOP perturbation experiment
- •4.2.3.3. Hyperspectral imaging
- •4.3. Experiment Two
- •4.3.1. Methods and Materials
- •4.3.1.1. Animals, anesthesia, blood pressure, and IOP perturbation
- •4.3.1.3. Spectral determinant of percentage oxygen saturation
- •4.3.1.5. Preparation and calibration of red blood cell suspensions
- •4.3.2. Results
- •4.3.2.2. Oxygen saturation of the ONH
- •4.3.3. Discussion
- •4.3.4. Conclusions
- •4.4. Experiment Three
- •4.4.1. Methods and Materials
- •4.4.1.1. Compliance testing
- •4.4.1.2. Hyperspectral imaging
- •4.4.1.3. Selection of ONH structures
- •4.4.1.4. Statistical methods
- •4.4.2. Results
- •4.4.2.1. Compliance testing
- •4.4.2.2. Blood spectra from ONH structures
- •4.4.2.3. Oxygen saturation of ONH structures
- •4.4.2.4. Oxygen saturation maps
- •4.4.3. Discussion
- •4.5. Experiment Four
- •4.5.1. Methods and Materials
- •4.5.2. Results
- •4.5.3. Discussion
- •4.6. Experiment Five
- •4.6.1. Methods and Materials
- •4.6.1.3. Automatic control point detection
- •4.6.1.4. Fused image optimization
- •4.7. Conclusion
- •References
- •5.1. Introduction to Thermography
- •5.2. Data Acquisition
- •5.3. Methods
- •5.3.1. Snake and GVF
- •5.3.2. Target Tracing Function and Genetic Algorithm
- •5.3.3. Locating Cornea
- •5.4. Results
- •5.5. Discussion
- •5.6. Conclusion
- •References
- •6.1. Introduction to Glaucoma
- •6.1.1. Glaucoma Types
- •6.1.1.1. Primary open-angle glaucoma
- •6.1.1.2. Angle-closure glaucoma
- •6.1.2. Diagnosis of Glaucoma
- •6.2. Materials and Methods
- •6.2.1. c/d Ratio
- •6.2.2. Measuring the Area of Blood Vessels
- •6.2.3. Measuring the ISNT Ratio
- •6.3. Results
- •6.4. Discussion
- •6.5. Conclusion
- •References
- •7.1. Introduction to Temperature Distribution
- •7.3. Mathematical Model
- •7.3.1. The Human Eye
- •7.3.2. The Eye Tumor
- •7.3.3. Governing Equations
- •7.3.4. Boundary Conditions
- •7.4. Material Properties
- •7.5. Numerical Scheme
- •7.5.1. Integro-Differential Equations
- •7.6. Results
- •7.6.1. Numerical Model
- •7.6.2. Case 1
- •7.6.3. Case 2
- •7.6.4. Discussion
- •7.7. Parametric Optimization
- •7.7.1. Analysis of Variance
- •7.7.2. Taguchi Method
- •7.7.3. Discussion
- •7.8. Concluding Remarks
- •References
- •8.1. Introduction to IR Thermography
- •8.2. Infrared Thermography and the Measured OST
- •8.3. The Acquisition of OST
- •8.3.1. Manual Measures
- •8.3.2. Semi-Automated and Fully Automated
- •8.4. Applications to Ocular Studies
- •8.4.1. On Ocular Physiologies
- •8.4.2. On Ocular Diseases and Surgery
- •8.5. Discussion
- •References
- •9.1. Introduction
- •9.1.1. Preprocessing
- •9.1.1.1. Shade correction
- •9.1.1.2. Hough transform
- •9.1.1.3. Top-hat transform
- •9.1.2. Image Segmentation
- •9.1.2.1. The region approach
- •9.1.2.2. The gradient-based method
- •9.1.2.3. Edge detection
- •9.1.2.3.2. The second-order derivative methods
- •9.1.2.3.3. The optimal edge detector
- •9.2. Image Registration
- •9.4. Automated, Integrated Image Analysis Systems
- •9.5. Conclusion
- •References
- •10.1. Introduction to Diabetic Retinopathy
- •10.2. Data Acquisition
- •10.3. Feature Extraction
- •10.3.1. Blood Vessel Detection
- •10.3.2. Exudates Detection
- •10.3.3. Hemorrhages Detection
- •10.3.4. Contrast
- •10.4.1. Backpropagation Algorithm
- •10.5. Results
- •10.6. Discussion
- •10.7. Conclusion
- •References
- •11.1. Related Studies
- •11.2.1. Encryption
- •11.3. Compression Technique
- •11.3.1. Huffman Coding
- •11.4. Error Control Coding
- •11.4.1. Hamming Codes
- •11.4.2. BCH Codes
- •11.4.3. Convolutional Codes
- •11.4.4. RS Codes
- •11.4.5. Turbo Codes
- •11.5. Results
- •11.5.1. Using Turbo Codes for Transmission of Retinal Fundus Image
- •11.6. Discussion
- •11.7. Conclusion
- •References
- •12.1. Introduction to Laser-Thermokeratoplasty (LTKP)
- •12.2. Characteristics of LTKP
- •12.3. Pulsed Laser
- •12.4. Continuous-Wave Laser
- •12.5. Mathematical Model
- •12.5.1. Model Description
- •12.5.2. Governing Equations
- •12.5.3. Initial-Boundary Conditions
- •12.6. Numerical Scheme
- •12.6.1. Integro-Differential Equation
- •12.7. Results
- •12.7.1. Pulsed Laser
- •12.7.2. Continuous-Wave Laser
- •12.7.3. Thermal Damage Assessment
- •12.8. Discussion
- •12.9. Concluding Remarks
- •References
- •13.1. Introduction to Optical Eye Modeling
- •13.1.1. Ocular Measurements for Optical Eye Modeling
- •13.1.1.1. Curvature, dimension, thickness, or distance parameters of ocular elements
- •13.1.1.2. Three-dimensional (3D) corneal topography
- •13.1.1.3. Crystalline lens parameters
- •13.1.1.4. Refractive index
- •13.1.1.5. Wavefront aberration
- •13.1.2. Eye Modeling Using Contemporary Optical Design Software
- •13.1.3. Optical Optimization and Merit Function
- •13.2. Personalized and Population-Based Eye Modeling
- •13.2.1. Customized Eye Modeling
- •13.2.1.1. Optimization to the refractive error
- •13.2.1.2. Optimization to the wavefront measurement
- •13.2.1.3. Tolerance analysis
- •13.2.2. Population-Based Eye Modeling
- •13.2.2.1. Accommodative eye modeling
- •13.2.2.2. Ametropic eye modeling
- •13.2.2.3. Modeling with consideration of ocular growth and aging
- •13.2.2.4. Modeling for disease development
- •13.2.3. Validation of Eye Models
- •13.2.3.1. Point spread function and modulation transfer function
- •13.2.3.2. Letter chart simulation
- •13.2.3.3. Night/day vision simulation
- •13.3. Other Modeling Considerations
- •13.3.1. Stiles Crawford Effect (SCE)
- •13.3.1.2. Other retinal properties
- •13.3.1.4. Optical opacity
- •13.4. Examples of Ophthalmic Simulations
- •13.4.1. Simulation of Retinoscopy Measurements with Eye Models
- •13.4.2. Simulation of PR
- •13.5. Conclusion
- •References
- •14.1. Network Infrastructure
- •14.1.1. System Requirements
- •14.1.2. Network Architecture Design
- •14.1.4. GUI Design
- •14.1.5. Performance Evaluation of the Network
- •14.2. Image Analysis
- •14.2.1. Vascular Tree Segmentation
- •14.2.2. Quality Assessment
- •14.2.3. ON Detection
- •14.2.4. Macula Localization
- •14.2.5. Lesion Segmentation
- •14.2.7. Patient Demographics and Statistical Outcomes
- •14.2.8. Disease State Assessment
- •14.2.9. Image QA
- •Acknowledgments
- •References
- •Index
Computational Methods for Feature Detection in Optical Images
Fig. 2.19. (a) Multi-threshold segmentation, (b) 100 × 100 pixel windowed segmentation, (c) 50 × 50 pixel windowed segmentation, (d) 10 × 10 pixel windowed segmentation.
Regions with varying intensities can be segmented by setting thresholds adaptively within windowed regions of the image. The window size should be small enough that nonuniform illumination has minimal effect within any single window containing a feature of interest. Figure 2.19 demonstrates how window size alters a simple threshold segmentation of blood vessels in an image with nonuniform illumination.
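As a concrete illustration, the sketch below applies an independent Otsu threshold inside each window; scikit-image, the Otsu criterion, the window size, and the assumption that vessels are darker than the background (as in the green channel) are illustrative choices rather than the exact procedure behind Fig. 2.19.

```python
import numpy as np
from skimage.filters import threshold_otsu

def windowed_threshold(image, win=50):
    """Apply an independent Otsu threshold inside each win x win window."""
    mask = np.zeros(image.shape, dtype=bool)
    for r in range(0, image.shape[0], win):
        for c in range(0, image.shape[1], win):
            block = image[r:r + win, c:c + win]
            if block.min() == block.max():
                continue  # flat window: nothing to threshold
            t = threshold_otsu(block)
            mask[r:r + win, c:c + win] = block < t  # vessels assumed darker than background
    return mask

# e.g. vessel_mask = windowed_threshold(green_channel, win=50)
```

Smaller windows adapt better to slow illumination changes, but a window that contains no feature pixels can yield spurious detections; window size controls this trade-off.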
2.3.4. Region-Based Methods for Image Segmentation
Although edge detection and linking methods can provide useful information for boundary detection and segmentation, results are not always reliable when boundaries are obscured or noisy. In most cases, region-based segmentation can provide accurate results without depending on linked boundaries to encapsulate an anatomical retinal feature. We will discuss methods that use discriminatory intra-retinal feature statistics to segment regions from within an image.
2.3.4.1. Region growing
This method starts from provided seed point locations and groups surrounding pixels (in a four- or eight-connected neighborhood, for example) based on a predefined statistical similarity criterion. The basic formulation is
\bigcup_{i=1}^{n} R_i = R,    (2.30)

where Ri is a connected region for i = 1, 2, . . . , n; Ri ∩ Rj = ∅ for all i ≠ j; P(Ri) = TRUE for i = 1, 2, . . . , n; and P(Ri ∪ Rj) = FALSE for adjacent regions Ri and Rj.
The similarity measure, P(Ri), and the seed point locations are chosen using domain knowledge of the anatomical feature of interest, usually based on intensity, local mean, standard deviation, or higher-level textural statistics. Another necessary parameter is the stopping condition. Although growth can simply stop when the similarity measure ceases to find similar pixels, region shape statistics can also be used to improve results when prior feature models are known. A recursive region-growing method has been used for the segmentation of yellow lesions, owing to their homogeneous gray-scale intensity.19 In Fig. 2.20, we show the results of recursive region-growing segmentation on a processed gray-level retinal image for different choices of seed point. This simple recursive method can attain powerful segmentation results when anatomical features form continuous regions (as in the image), but fails when regions contain large statistical discontinuities caused by occlusion or illumination.
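A minimal sketch of such a region grower is given below, assuming a single seed, four-connectivity, and a similarity predicate that accepts a neighbor whose intensity lies within a fixed tolerance of the running region mean; these choices are illustrative and not those of the published lesion-segmentation method.

```python
import numpy as np
from collections import deque

def region_grow(image, seed, tol=10.0):
    """Grow a region from a (row, col) seed: a 4-connected neighbor is accepted
    when its intensity lies within `tol` of the running region mean."""
    h, w = image.shape
    region = np.zeros((h, w), dtype=bool)
    region[seed] = True
    total, count = float(image[seed]), 1
    frontier = deque([seed])
    while frontier:
        r, c = frontier.popleft()
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and not region[nr, nc]:
                if abs(float(image[nr, nc]) - total / count) <= tol:  # similarity predicate P
                    region[nr, nc] = True
                    total += float(image[nr, nc])
                    count += 1
                    frontier.append((nr, nc))
    return region

# e.g. lesion_mask = region_grow(gray_image, seed=(120, 200), tol=8.0)
```

The stopping condition here is implicit: growth halts when no remaining neighbor satisfies the predicate; a shape-based stopping rule could be added where a prior feature model is available.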
Fig. 2.20. (a) Seed points used in region growing and (b) region growing using seed points and intensity similarity predicate.

2.3.4.2. Watershed segmentation

Fig. 2.21. (a) Localized optic disk image, (b) 3D visualization with intensity magnitude in z direction, and (c) sample watershed segmentation.

In this segmentation approach, an intensity image is represented in a 3D space in which the intensity at each pixel location (x, y) denotes height (Fig. 2.21). Although the approach combines operations from both edge detection and morphology, the watershed treats regional peaks and valleys (regional maxima and minima) as catchment basins for liquid. Segments are created from this topographical representation as connected components lying within a regional minimum and surrounded by a connected regional maximum.
The watershed segmentation algorithm can be conceptualized as follows.20 First, let M1, M2, . . . , MR be the sets of coordinates of the points in the regional minima of an image g(x, y), which is a gradient image computed using any of the previous methods. Let C(Mi) be the set of coordinates of the points in the catchment basin associated with regional minimum Mi. Let T[n] represent the set of locations (s, t) for which g(s, t) < n, written as:
T[n] = \{(s, t) \mid g(s, t) < n\}.    (2.31)
T[n] is the set of locations in g(x, y) lying below the plane g(s, t) = n. The topography is then “flooded” in integer increments, from n = min + 1 to n = max + 1, where min and max are the minimum and maximum values of g. Now, Cn(Mi) represents the points in the catchment basin of Mi that are flooded at stage n, written as:
C_n(M_i) = C(M_i) \cap T[n].    (2.32)
Viewed as a binary image, this expression equals one at a location (x, y) that belongs to both C(Mi) and T[n], and zero otherwise. Next, we form
C[n] = \bigcup_{i=1}^{R} C_n(M_i).    (2.33)

Then, C[max + 1] is the union of all catchment basins:

C[\max + 1] = \bigcup_{i=1}^{R} C(M_i).    (2.34)
The algorithm begins at C[min + 1] = T[min + 1], and watershed segmentation members are detected recursively, with step n carried out only after C[n − 1] has been found. To obtain C[n] from C[n − 1], let Q[n] be the set of connected components in T[n]. For each connected component q ∈ Q[n], the intersection q ∩ C[n − 1] is either empty, contains a single connected component of C[n − 1], or contains more than one connected component of C[n − 1]. How q is incorporated into C[n] depends on which of these conditions is satisfied:
•An empty intersection occurs when a new minimum is encountered; q is added to C[n − 1] to form C[n].
•A single connected component means that q lies within an existing basin; q is added to C[n − 1] to form C[n].
•More than one connected component means that q contains all or part of a ridge separating two or more catchment basins; a “dam” is then built as a one-pixel-thick ridge by dilating q ∩ C[n − 1] with a 3 × 3 structuring element of ones, constraining the dilation to q.
The watershed transform has been used in retinal image analysis to segment the optic disk using the red channel of the RGB color space.21 A sample watershed result is shown in Fig. 2.21; it illustrates how the method uses regional minima and maxima for segmentation.
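The sketch below is a minimal marker-controlled version of this idea using scikit-image: regional minima of a smoothed gradient image seed the basins, and the flooding of Eqs. (2.31)–(2.34) is delegated to the library's watershed routine. The Sobel gradient and the filter window sizes are illustrative assumptions, not the red-channel optic disk pipeline of the cited work.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import sobel
from skimage.segmentation import watershed

def watershed_segment(gray, min_window=15):
    """Flood a gradient image from its regional minima (one label per catchment basin)."""
    gradient = sobel(gray)                                    # edge strength as topography
    smoothed = ndi.uniform_filter(gradient, size=5)           # suppress spurious minima
    minima = smoothed == ndi.minimum_filter(smoothed, size=min_window)
    markers, _ = ndi.label(minima)                            # each minimum seeds a basin
    return watershed(gradient, markers)

# e.g. labels = watershed_segment(red_channel)
```

Without restricting the markers, the many shallow minima in a retinal gradient image typically cause over-segmentation, which is why marker selection matters in practice.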
2.3.4.3. Matched filter segmentation
In cases where prior models of retinal anatomy are available, we can create kernels (templates) that are convolved with the image, producing maximal responses at locations that closely match the template. Blood vessels in retinal images have intensity characteristics that lend themselves to template modeling: they appear as piecewise-linear segments with a Gaussian intensity profile.22 Using these assumptions, we can formulate the Gaussian curve as:
h(x, y) = A\{1 - k\, e^{-d^2/2\sigma^2}\},    (2.35)
where d is the perpendicular distance between the point (x, y) and a straight line passing through the center of the blood vessel along its length, σ defines the spread of the intensity profile, and A is the local background intensity. One required step in matched filtering that adds to the complexity of this method is that
the kernel must be rotated to find objects in various orientations. A rotation matrix is used, given by:
\bar{r}_i = \begin{bmatrix} \cos\theta_i & -\sin\theta_i \\ \sin\theta_i & \cos\theta_i \end{bmatrix},    (2.36)
where θi is the orientation, with a corresponding point in the rotated coordinate system given by:

\bar{p}_i = [u \; v] = \bar{p}\, \bar{r}_i^{T}.    (2.37)
If we divide the orientations into 15◦ increments, 12 kernels must be convolved with the image to cover all orientations, and the maximum response is retained at each location (x, y). The Gaussian tail is truncated at u = ±3σ, and a neighborhood N is defined within this truncated extent of u. The weights in the ith kernel are given as:

\bar{K}_i(x, y) = -e^{-u^2/2\sigma^2}, \quad \bar{p}_i \in N.    (2.38)
Let A denote the number of points in N; the mean value, mi, of the kernel is then given as:

m_i = \frac{1}{A} \sum_{\bar{p}_i \in N} \bar{K}_i(x, y).    (2.39)
The convolution mask is then given by:
K_i'(x, y) = \bar{K}_i(x, y) - m_i, \quad \bar{p}_i \in N.    (2.40)
In Fig. 2.22, we present an example of the matched filter kernels and a segmentation result obtained with a simple magnitude threshold on the maximum filter response at each pixel.
Fig. 2.22. (a) 3D representation of a matched filter used for blood vessel detection, (b) filter bank for 15◦ orientation increments, and (c) vessel segmentation results using low bound threshold.
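As a sketch of how such a filter bank can be assembled, the code below builds twelve zero-mean Gaussian-profile kernels at 15° increments following Eqs. (2.35)–(2.40), convolves each with the image, and keeps the maximum response per pixel. The values of σ, the segment length, the kernel support, and the final threshold are illustrative assumptions rather than the parameters used for Fig. 2.22.

```python
import numpy as np
from scipy.ndimage import convolve

def matched_filter_bank(sigma=2.0, length=9, n_orient=12):
    """Zero-mean Gaussian-profile kernels rotated in 15-degree steps (Eqs. (2.35)-(2.40))."""
    half = int(np.ceil(3 * sigma + length / 2))
    ys, xs = np.mgrid[-half:half + 1, -half:half + 1]
    kernels = []
    for i in range(n_orient):
        theta = i * np.pi / n_orient
        u = xs * np.cos(theta) + ys * np.sin(theta)        # perpendicular to the vessel
        v = -xs * np.sin(theta) + ys * np.cos(theta)       # along the vessel
        inside = (np.abs(u) <= 3 * sigma) & (np.abs(v) <= length / 2)  # neighborhood N
        k = np.where(inside, -np.exp(-u**2 / (2 * sigma**2)), 0.0)     # Eq. (2.38)
        k[inside] -= k[inside].mean()                      # subtract m_i, Eqs. (2.39)-(2.40)
        kernels.append(k)
    return kernels

def vessel_response(image, kernels):
    """Maximum filter response over all orientations at each pixel."""
    return np.max([convolve(image.astype(float), k) for k in kernels], axis=0)

# e.g. response = vessel_response(green_channel, matched_filter_bank())
#      vessels = response > response.mean() + 2 * response.std()  # simple low-bound threshold
```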