
Rajendra Acharya, U. et al.

classes at the output. However, the network was trained to identify only three classes (normal, NPDR, and PDR), given by the binary-coded outputs [00, 10, 01], respectively. The areas of the blood vessels, exudates, and hemorrhages, together with the contrast of the image, were computed and fed as inputs to the classifier.
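As a minimal sketch of this setup (the layer size, weights, and function names here are illustrative assumptions, not the authors' implementation), a feedforward pass with the two-bit output coding could look like:

```python
import numpy as np

# Two-bit output coding used for the three classes.
CODES = {(0, 0): "normal", (1, 0): "NPDR", (0, 1): "PDR"}

def decode(output, threshold=0.5):
    """Threshold the two network outputs and map the bit pair to a class."""
    bits = tuple(int(o > threshold) for o in output)
    return CODES.get(bits, "unknown")  # (1, 1) is not a valid code

def forward(x, W1, b1, W2, b2):
    """One hidden layer with sigmoid activations; x holds the four input
    features (blood vessel, exudate, and hemorrhage areas, and contrast)."""
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    h = sigmoid(W1 @ x + b1)
    return sigmoid(W2 @ h + b2)
```

A network output of, say, (0.93, 0.08) thresholds to the code 10 and is therefore read as NPDR; the invalid code 11 is reported as unknown.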

10.5. Results

Table 10.2 shows the analysis of variance (ANOVA) results for the features extracted from normal, NPDR, and PDR images. These results show that the blood vessel area is larger for NPDR and smaller for PDR. Exudates are absent in normal images and present in NPDR. The hemorrhage area is larger for PDR, and the contrast is higher for normal fundus images. All features are statistically significant (p < 0.0001).
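The per-feature ANOVA behind Table 10.2 can be reproduced in outline with a hand-rolled one-way F-test. The group samples below are synthetic, drawn from the mean and standard deviation reported for blood vessel area; they are illustrative only, not the study's data:

```python
import numpy as np

def one_way_anova(*groups):
    """Return the F-statistic of a one-way ANOVA across the groups."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    n_total = sum(len(g) for g in groups)
    grand_mean = np.concatenate(groups).mean()
    # Between-group and within-group sums of squares.
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_between = len(groups) - 1
    df_within = n_total - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Illustrative blood-vessel-area samples for the three classes (synthetic).
rng = np.random.default_rng(0)
f = one_way_anova(rng.normal(325000, 20278, 30),   # normal
                  rng.normal(343975, 14890, 30),   # NPDR
                  rng.normal(51171, 20652, 30))    # PDR
```

A large F-statistic on well-separated groups like these corresponds to the very small p-values (p < 0.0001) reported in the table.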

Table 10.3 shows the classification results of the ANN classifier. We have used 90 images for training and 30 for testing. Our proposed system

Table 10.2. Results of area of blood vessels, exudates, hemorrhages, and contrast.

Features            PDR                NPDR               Normal             p-value
Blood vessel area   51,171 ± 20,652    343,975 ± 14,890   325,000 ± 20,278   p < 0.0001
Exudate area        8,148 ± 3,149      8,776 ± 2,106      Absent             p < 0.0001
Hemorrhage area     5,103 ± 4,329      2,213 ± 2,107      Absent             p < 0.0001
Contrast            0.0698 ± 0.0212    0.0691 ± 0.0182    0.0867 ± 0.0136    p < 0.0001

Table 10.3. Results of automatic classification.

Class      No. of data used   No. of data used   Overall percentage
           for training       for testing        of success
Normal     30                 10                 100.00
PDR        30                 10                  90.00
Non-PDR    30                 10                 100.00
Average                                          96.67


Computer-Aided Diagnosis of Diabetic Retinopathy Stages

classifies all normal and NPDR images correctly, and PDR images are classified correctly with an accuracy of 90%. The average classification accuracy is 96.67%. Table 10.4 shows the results of sensitivity, specificity, and positive predictive accuracy for the proposed system. The table shows

Table 10.4. Results of sensitivity, specificity, and positive predictive accuracy for the proposed system.

True       True       False      False      Positive
positive   negative   positive   negative   predictive value   Sensitivity   Specificity
10         19         1          0          90.91%             100%          95%

Fig. 10.8. Results of (a) blood vessel detection and (b) exudate detection for normal, NPDR, and PDR images using image processing.


that sensitivity and specificity of the proposed system are 100% and 95%, respectively.
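The entries of Table 10.4 are internally consistent and follow directly from the confusion counts; a quick sanity check (the function name is ours):

```python
def diagnostic_metrics(tp, tn, fp, fn):
    """Sensitivity, specificity, and positive predictive value
    from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    return sensitivity, specificity, ppv

# Counts from Table 10.4: TP = 10, TN = 19, FP = 1, FN = 0.
sens, spec, ppv = diagnostic_metrics(10, 19, 1, 0)
# sens = 1.0 (100%), spec = 0.95 (95%), ppv = 10/11 ≈ 0.9091 (90.91%)
```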

Figures 10.8(a)–(c) show the results of blood vessel detection, exudate detection, and hemorrhage detection for normal, NPDR, and PDR images. These figures show that there are fewer blood vessels in normal images and more in the NPDR and PDR stages. Exudates and hemorrhages are absent in normal images but present in the NPDR and PDR stages.

10.6. Discussion

A computer diagnostic system was developed to detect three early lesions: hemorrhages and microaneurysms (MAs), hard exudates, and cotton-wool spots, and to classify NPDR based on these three types of lesions, using 361 images.19 The rates of correct diagnosis between the computer system and the reading center ranged from 82.6% to 88.3% across hemorrhages and MAs, hard exudates, and cotton-wool spots. The results from the proposed classification system were comparable to those provided by human experts, and can be used as a clinical aid to physicians for screening, diagnosing, and detecting NPDR.

A decision support system for the early diagnosis of DR by detecting the presence of MAs was developed by Kahai et al.20 Their results show that their support system was able to achieve sensitivity and specificity of 100% and 67%, respectively.

Four retinal conditions (normal retina, moderate NPDR, severe NPDR, and PDR) were automatically classified using the area and perimeter of the blood vessels in the red, green, and blue layers of the images and a neural network classifier.21 Their proposed method demonstrated an accuracy of more than 80%, a sensitivity of more than 90%, and a specificity of 100%.

Using blood vessels, exudates, and texture parameters, together with a feedforward neural network, three classes (normal, NPDR, and PDR) were automatically classified.15 They identified the unknown class with an accuracy of 93%, and a sensitivity and specificity of 90% and 100%, respectively.

A low-cost screening method to identify normal and abnormal fundus images, based on exudates and lesions, was proposed.22 The classification was performed with a statistical classifier and a local-window-based

314

Computer-Aided Diagnosis of Diabetic Retinopathy Stages

verification strategy. Their method identified all retinal images with exudates with 100% accuracy and normal images as normal with 70% accuracy.

The normal, mild NPDR, moderate NPDR, severe NPDR, and PDR classes were automatically classified using higher order spectra features and an SVM classifier.23 Acharya et al. demonstrated a sensitivity and specificity of 82% and 88%, respectively, using 300 digital fundus images.

Features, namely blood vessels, exudates, MAs, and hemorrhages, were extracted and assembled into an input vector before being fed to an SVM classifier. This classifier had to identify five groups: normal retina, mild NPDR, moderate NPDR, severe NPDR, and PDR.16 They identified the unknown class with a sensitivity and specificity of 82% and 86%, respectively.

In our present work, four features, namely blood vessels, exudates, hemorrhages, and texture, were used. These parameters were presented to a neural network for automated classification. Our results show an accuracy of 96.67%, a sensitivity of 100%, and a specificity of 95%. The classification accuracy can be further improved by using better features, more diverse fundus images taken under good lighting conditions, and better classifiers.

10.7. Conclusion

DR is a progressive eye disease caused by prolonged diabetes. It can result in vision loss if not detected at an early stage. In this work, we have proposed an automated system to identify normal, NPDR, and PDR fundus images using image-processing and data-mining techniques. The proposed system identifies normal, NPDR, and PDR images with an accuracy of 96.67%, a sensitivity of 100%, and a specificity of 95%. Our results show that the system can help ophthalmologists to automatically identify early-stage DR. Additionally, ophthalmologists can use this system as an adjunct tool for screening.

References

1. Klein, R., Klein, B.E., and Moss, S.E. Visual impairment in diabetes. Ophthalmology 91:1–9, 1984.

2. Eye Disease: http://visioneyedoctor.com/eye-disease/page-4.htm.

3. Frank, R.N. Diabetic retinopathy. Prog Retin Eye Res 14:361–392, 1995.


4. Ong, G.L., Ripley, L.G., Newsom, R.S., Cooper, M., and Casswell, A.G. Screening for sight-threatening diabetic retinopathy: comparison of fundus photography with automated color contrast threshold test. Am J Ophthalmol 137:445–452, 2004.

5. Fong, D.S., Aiello, L., Gardner, T.W., King, G.L., Blankenship, G., Cavallerano, J.D., Ferris, F.L., and Klein, R. Diabetic retinopathy. Diabetes Care 26:226–229, 2003.

6. http://reseau-ophdiat.aphp.fr/Document/Doc/confliverpool.pdf.

7. Hayashi, J., Kunieda, T., Cole, J., Soga, R., Hatanaka, Y., Lu, M., Hara, T., and Fujita, H. A development of computer-aided diagnosis system using fundus images. In: Proceedings of the Seventh International Conference on Virtual Systems and MultiMedia (VSMM 2001), pp. 429–438, 2001.

8. Xiaohui, Z. and Chutatape, O. Detection and classification of bright lesions in colour fundus images. Int Conf Image Process 1:139–142, 2004.

9. Gardner, G., Keating, D., Williamson, T., and Elliott, A. Automatic detection of diabetic retinopathy using an artificial neural network: a screening tool. Br J Ophthalmol 80:940–944, 1996.

10. Niemeijer, M., van Ginneken, B., Staal, J., Suttorp-Schulten, M., and Abramoff, M. Automatic detection of red lesions in digital color fundus photographs. IEEE Trans Med Imaging 24:584–592, 2005.

11. Li, H., Hsu, W., Lee, M.L., and Wong, T.Y. Automated grading of retinal vessel caliber. IEEE Trans Biomed Eng 52:1352–1355, 2005.

12. Vallabha, D., Dorairaj, R., Namuduri, K.R., and Thompson, H. Automated detection and classification of vascular abnormalities in diabetic retinopathy. Thirty-Eighth Asilomar Conference on Signals, Systems and Computers, 2004.

13. Gonzalez, R.C. and Woods, R.E. Digital Image Processing, Second Edition, Prentice Hall, New Jersey, 2001.

14. Acharya, U.R., Lim, C.M., Ng, E.Y.K., Chee, C., and Tamura, T. Computer based detection of diabetes retinopathy stages using digital fundus images. J Eng Med 223(H5):545–553, 2009.

15. Nayak, J., Bhat, P.S., Acharya, U.R., Lim, C.M., and Kagathi, M. Automated identification of different stages of diabetic retinopathy using digital fundus images. J Med Sys 32:107–115, 2008.

16. Sinthanayothin, C., Boyce, J., and Williamson, C.T. Automated localization of the optic disk, fovea, and retinal blood vessels from digital colour fundus images. Br J Ophthalmol 38:902–910, 1999.

17. Randle, V. and Engler, O. Introduction to Texture Analysis: Macrotexture, Microtexture and Orientation Mapping, CRC Press, 2000.

18. Yegnanarayana, B. Artificial Neural Networks, Prentice-Hall of India, New Delhi, 1999.

19. Samuel, C.L., Elisa, T.L., Yiming, W., Ronald, K., Ronald, M.K., and Ann, W. Computer classification of nonproliferative diabetic retinopathy. Arch Ophthalmol 123:759–764, 2005.

20. Kahai, P., Namuduri, K.R., and Thompson, H. A decision support framework for automated screening of diabetic retinopathy. Int J Biomed Imaging:1–8, 2006.

21. Wong, L.Y., Acharya, U.R., Venkatesh, Y.V., Chee, C., Lim, C.M., and Ng, E.Y.K. Identification of different stages of diabetic retinopathy using retinal optical images. Inf Sci 178, 2008.


22. Wang, H., Hsu, W., Goh, K., and Lee, M. An effective approach to detect lesions in colour retinal images. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 181–187, 2000.

23. Acharya, U.R., Chua, K.C., Ng, E.Y.K., Wei, W., and Chee, C. Application of higher order spectra for the identification of diabetes retinopathy stages. J Med Sys 32:481–488, 2008.



Chapter 11

Reliable Transmission of Retinal Fundus Images with Patient Information Using Encryption, Watermarking, and Error Control Codes

Myagmarbayar Nergui, Sripati Acharya, U., Rajendra Acharya, U., Wenwei Yu, and Sumeet Dua§

Today, digital media are important to many aspects of entertainment, business, and medicine. Many service and sales providers, broadcast companies, wireless cellular phone service providers, and consumer electronics companies use digital signal processing (DSP) techniques to improve their quality of service (QoS). Ophthalmologists and other eye care clinicians can leverage this trend, especially for eye images, which can be transmitted along with patient information. Patient diagnoses and image information are processed and stored digitally. At present, the patient diagnosis (text) and the retinal fundus image constitute two separate forms of data and are read as different files. Hence, a patient diagnosis may be matched to the wrong image. This interchanging of patient information can be prevented by embedding patient diagnosis and retinal fundus image
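As one generic way to bind the diagnosis text to the image (a least-significant-bit sketch under our own naming, not necessarily the scheme developed in this chapter), the embedding could look like:

```python
import numpy as np

def embed_lsb(image, message):
    """Hide message bytes in the least-significant bits of a uint8 image.
    A 32-bit length header is stored first so the reader knows where to stop."""
    data = len(message).to_bytes(4, "big") + message
    bits = np.unpackbits(np.frombuffer(data, dtype=np.uint8))
    flat = image.flatten()  # flatten() copies, so the input is untouched
    if bits.size > flat.size:
        raise ValueError("message does not fit in the image")
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits
    return flat.reshape(image.shape)

def extract_lsb(image):
    """Recover the embedded bytes from the least-significant bit plane."""
    flat = image.flatten() & 1
    length = int.from_bytes(np.packbits(flat[:32]).tobytes(), "big")
    return np.packbits(flat[32 : 32 + 8 * length]).tobytes()
```

`extract_lsb(embed_lsb(img, b"patient 007: NPDR"))` returns the original bytes; practical schemes add encryption and error control codes on top, as the chapter title indicates.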

Department of Electronics & Communication, National Institute of Technology Karnataka, Surathkal, India.

Department of ECE, Ngee Ann Polytechnic, Singapore.

Graduate School of Medical System Engineering, Chiba University, Japan.

§Department of Computer Science, Louisiana Tech University, Ruston, LA 71272, USA.
