Handbook of Optical Coherence Tomography, Bouma and Tearney, eds., 2002
Full-Field OCM | 331
better electronics [34]. With 0.95 NA objective lenses, the lateral resolution, almost limited by diffraction, was about 0.5 µm. Reducing the optical wavelength to 400 nm using a blue LED would improve the lateral resolution to approximately 0.25 µm.
11.6 CONCLUSION AND PERSPECTIVES
Full-field illumination and parallel detection have been demonstrated with optical coherence microscopy (OCM). Using high NA objective lenses, the technique allows imaging with micrometer spatial resolution in all three dimensions and 20 ms temporal resolution.
This approach is attractive for the following reasons:
- Head-on (XY) geometry allows high resolution imaging. Three-dimensional submicrometer resolution is achieved using a high-NA objective lens. In contrast with XZ OCT imaging, the resolution is the same everywhere in the full XY image, because the whole field of view is in the plane of focus of the objective lens.
- Head-on (XY) geometry allows the acquisition time and the intensity irradiating the sample to be adjusted according to the imaging depth.
- If the source power is not limited, parallel detection requires less time than serial detection to achieve a given SNR in the image. In the coming years it is expected that more powerful light sources and faster cameras will make it possible to take full advantage of this approach.
- Scanning and the associated vibrations are reduced to a minimum.
- Simultaneous acquisition allows the acquisition of a complete image to be synchronized to a physiological event.
There are some limitations of this method that may make it unsuitable for certain applications:
- The achievable temporal resolution per pixel is limited to four camera frame periods (on the order of milliseconds).
- Confocal filtering cannot be implemented in full-field detection as it is in scanning OCM systems. However, the interference microscope geometry formally plays the same role as a confocal pinhole (see Section 11.4.1).
- Artifacts may be present when imaging samples with strong birefringence and backscattering polarization anisotropy with a polarization interferometer. However, the layouts presented here can easily be modified to become polarization-sensitive and then be used to measure sample birefringence.
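The four-frames-per-image figure in the first limitation comes from phase-stepped detection: one amplitude image is computed from four camera frames taken at π/2 phase increments (as in Section 11.4). A minimal numerical sketch of the standard four-step demodulation on a synthetic fringe model (all array names and values here are illustrative, not the authors' data):

```python
import numpy as np

# Synthetic fringe frames: I_j = B + A*cos(phi + j*pi/2), j = 0..3,
# where A is the interference amplitude per pixel and B the background.
rng = np.random.default_rng(0)
B = rng.uniform(1.0, 2.0, size=(64, 64))      # incoherent background
A = rng.uniform(0.0, 0.5, size=(64, 64))      # interference amplitude
phi = rng.uniform(0, 2*np.pi, size=(64, 64))  # random speckle phase
I1, I2, I3, I4 = (B + A*np.cos(phi + j*np.pi/2) for j in range(4))

# Four-step demodulation: amplitude and phase per pixel.
amp = 0.5*np.sqrt((I1 - I3)**2 + (I2 - I4)**2)
ph = np.arctan2(I4 - I2, I1 - I3)
```

The amplitude image `amp` recovers the interference envelope `A` pixel by pixel; the same four frames also yield the speckle phase, so the acquisition cost is four frame periods per image.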
One of our main motivations was to explore layouts allowing high resolution OCT imaging in the three dimensions. The use of high NA objective lenses in an XY (en face) imaging geometry effectively provides 3-D submicrometer resolution, which makes it an alternative approach to XZ imaging with spectrally wide sources [35]. Furthermore, parallel detection in coherence microscopy is a tempting approach, provided that one is not limited by the available power. We are currently investigating the use of white light sources, CCD cameras with higher full well capacities and frame rates, and fast modulators (liquid crystal, 1 MHz) to improve the sensitivity and temporal resolution of these systems.
332 | Saint-Jalmes et al.
ACKNOWLEDGMENTS
We thank J. Mertz for many fruitful discussions. This research was supported by the Centre National de la Recherche Scientifique (CNRS) and by the Direction Générale de l'Armement (DGA).
REFERENCES
1. Wilson T, Sheppard CJR. Theory and Practice of Scanning Optical Microscopy. New York: Academic, 1984.
2. Pawley J, ed. Handbook of Biological Confocal Microscopy. 2nd ed. New York: Plenum Press, 1995:445.
3. Wilson T. Confocal Microscopy. London: Academic, 1990.
4. Huang D, Swanson EA, Lin CP, Schuman JS, Stinson WG, Chang W, Hee MR, Flotte T, Gregory K, Puliafito CA, Fujimoto JG. Optical coherence tomography. Science 254:1178, 1991.
5. Izatt JA, Hee MR, Owen GM, Swanson EA, Fujimoto JG. Optical coherence microscopy in scattering media. Opt Lett 19:590, 1994.
6. Tearney GJ, Bouma BE, Boppart SA, Golubovic B, Swanson EA, Fujimoto JG. Rapid acquisition of in vivo biological images by use of optical coherence tomography. Opt Lett 21:1408, 1996.
7. Tearney GJ, Brezinski ME, Bouma BE, Boppart SA, Pitris C, Southern JF, Fujimoto JG. In vivo endoscopic optical biopsy with optical coherence tomography. Science 276:2037, 1997.
8. Boppart SA, Brezinski ME, Bouma BE, Tearney GJ, Fujimoto JG. Investigation of developing embryonic morphology using optical coherence tomography. Dev Biol 177:54, 1996.
9. Schmitt JM, Yadlowsky MJ, Bonner RF. Subsurface imaging of living skin with optical coherence microscopy. Dermatology 191:93, 1995.
10. Izatt JA, Kulkarni MD, Wang H-W, Kobayashi K, Sivak MV Jr. Optical coherence tomography and microscopy in gastrointestinal tissues. IEEE J Selected Topics Quantum Electron 2:1017, 1996.
11. Kempe M, Rudolph W, Welsch E. Comparative study of confocal and heterodyne microscopy for imaging through scattering media. J Opt Soc Am A 13:46, 1996.
12. Beaurepaire E, Moreaux L, Amblard F, Mertz J. Combined scanning optical coherence and two-photon excited fluorescence microscopy. Opt Lett 24:969, 1999.
13. Beaurepaire E, Boccara AC, Lebec M, Blanchot L, Saint-Jalmes H. Full-field optical coherence microscopy. Opt Lett 23:244, 1998.
14. Hinshaw WS. Image formation by NMR: The sensitive-point method. J Appl Phys 47:3709, 1976.
15. Lauterbur PC. NMR zeugmatographic imaging by true 3D reconstruction. J Comput Assist Tomogr 5:285, 1981.
16. Boppart SA, Bouma BE, Pitris C, Southern JF, Brezinski ME, Fujimoto JG. In vivo optical coherence tomography cellular imaging. Nature Med 4(7):861, 1998.
17. Schmitt JM, Knüttel A, Gandjbakhche A, Bonner RF. Optical characterization of dense tissues using low-coherence interferometry. SPIE Proc 1889:197, 1989.
18. Yariv A. Optical Electronics. 4th ed. Philadelphia: Saunders, 1991.
19. Schmitt JM, Knüttel A. Model of optical coherence tomography of heterogeneous tissue. J Opt Soc Am A 14:1231, 1997.
20. Yadlowsky MJ, Schmitt JM, Bonner RF. Multiple scattering in optical coherence microscopy. Appl Opt 34(25):5699, 1995.
21. Boccara AC, Charbonnier F, Fournier D, Gleyzes P. French Patent FR90.092255 and international extensions (1990).
22. Gleyzes P, Boccara AC, Saint-Jalmes H. Multichannel Nomarski microscope with polarization modulation: Performance and applications. Opt Lett 22(20):1529, 1997.
23. Kino GS, Chim SC. Mirau correlation microscope. Appl Opt 29:3775, 1990.
24. Dubois A, Boccara AC, Lebec M. Real-time reflectivity and topography imagery of depth-resolved microscopic surfaces. Opt Lett 24:309, 1999.
25. Davidson M, Kaufman K, Mazor I, Cohen F. An application of interference microscopy to integrated circuit inspection and metrology. In: KM Monahan, ed. Integrated Circuit Metrology, Inspection, and Process Control. Proc SPIE 775:233, 1987.
26. Schmitt JM, Lee SL, Yung KM. An optical coherence microscope with enhanced resolving power in thick tissue. Opt Commun 142:203, 1997.
27. Badoz J, Billardon M, Canit JC, Russel MF. Sensitive devices to determine the state and degree of polarization of a light beam using a birefringent modulator. J Opt (Paris) 8:373, 1977.
28. Canit JC, Badoz J. New design for a photoelastic modulator. Appl Opt 22:592, 1983.
29. Lévêque S, Boccara AC, Lebec M, Saint-Jalmes H. Ultrasonic tagging of photon paths in scattering media: Parallel speckle modulation processing. Opt Lett 24:181, 1999.
30. Forget BC, Grauby S, Fournier D, Gleyzes P, Boccara AC. High resolution AC temperature field imaging. Electron Lett 33:1688, 1997.
31. Sheppard CJR, Wilson T. Effects of high angles of convergence on V(z) in the scanning acoustic microscope. Appl Phys Lett 38:858–859, 1981.
32. Richards B, Wolf E. Electromagnetic diffraction in optical systems: II. Structure of the image field in an aplanatic system. Proc R Soc Lond Ser A 253:358, 1959.
33. Chang FC, Kino GS. 325-nm interference microscope. Appl Opt 37:3471, 1998.
34. Laeri F, Strand TC. Angstrom resolution optical profilometry for microscopic objects. Appl Opt 26:2245, 1987.
35. Drexler W, Morgner U, Kärtner FX, Pitris C, Boppart SA, Li XD, Ippen EP, Fujimoto JG. In vivo ultrahigh-resolution optical coherence tomography. Opt Lett 24:1221, 1999.
12
Spectral Radar: Optical Coherence Tomography in the Fourier Domain
M. W. LINDNER, P. ANDRETZKY, F. KIESEWETTER, and G. HÄUSLER
University of Erlangen-Nuernberg, Erlangen, Germany
12.1 INTRODUCTION
This chapter will discuss white light interferometry within scattering media. White light interferometry at nonspecular surfaces was first demonstrated only about 10 years ago [1]. The physics of white light interferometry at ‘‘rough’’ surfaces or within scattering objects is rather different from that of Michelson's white light interferometry. Interferometry on non-smooth objects imposes stringent requirements on spatial coherence, temporal coherence, observation aperture, and lateral photodetector resolution.
In white light interferometry on rough surfaces or within scattering objects, the subjective speckle pattern in the observation plane is the source of information. Hence we have to take care that the aperture of illumination is smaller than the aperture of observation, in order to create sufficient speckle contrast. Furthermore, the photodetector elements must not be much larger than the speckles, in order to avoid speckle reduction by averaging. One serious difference from smooth surface interferometry is the achievable accuracy: The interference phase within each speckle is arbitrary and random. The phase variation between different speckles depends on the roughness and may be as great as 10–20 times 2π for ground or milled surfaces. Hence, the achievable measuring uncertainty is limited and is equal to the surface roughness (which is usually much smaller than the coherence length of the source) [2].
336 | Lindner et al.
Measurements within the bulk of strongly scattering media require conditions similar to those mentioned above. Specifically, the source of information is still the speckle contrast. One major difference from rough surface measurements is that this speckle contrast is usually extremely small if the measured volume is deeper than the coherence length of the source. In those objects, we observe a measuring uncertainty and a depth resolution not much better than the coherence length.
During the last 10 years, our group developed a couple of sensors based on white light interferometry on rough surfaces or within scattering media. The first was ‘‘coherence radar’’ [1,2]. Here the reference is mechanically scanned through the measuring volume. The name ‘‘coherence radar’’ suggests itself because it essentially measures the local time of flight to each object pixel by use of the coherent reference ‘‘clock.’’ The sensor can measure the shape of rough surfaces with an accuracy equal to the surface roughness, which is usually in the 1 µm regime. The objects may be volume scatterers such as ceramics or skin. One exciting feature of coherence radar is that the measuring uncertainty does not depend upon the object distance or the observation aperture: We can measure in deep boreholes without loss of accuracy.
One modification of coherence radar is ‘‘dispersion radar’’ [3]. Here we introduce a dispersive element within the reference arm before we spectrally evaluate the interferometer output. The major advantage of this method is that we can have a large measuring range even with a low cost spectrometer.
The last modification is what we call ‘‘spectral radar’’ [4], which will be discussed in this chapter. We hope that this short introduction has given the reader some idea of the physics of this new type of interferometry and its potential applications.
12.2 BENEFITS AND LIMITATIONS OF LOW COHERENCE INTERFEROMETRY
An important medical aim in dermatology is the early diagnosis of pathological tissue alterations (e.g., skin cancer). High resolution imaging methods are needed that are not harmful to healthy tissue. During recent years, noninvasive cross-sectional imaging methods under the heading ‘‘optical tomography’’ have been developed. The sensing methods are manifold but are based on a common feature: The tissue (which is a volume scatterer) is illuminated, and the number of photons scattered back to the detector is measured as a function of the pathlength in the tissue. This principle gives, to a certain extent, access to the scattering amplitude a(z), which is the key to examining the local scattering and absorption behavior in the tissue. The hope is that pathological tissue displays significantly different scattering properties and can therefore be distinguished from healthy tissue. The pathlength distribution of the photons can be measured directly by time-of-flight measurements or by using low coherence interferometry.
Optical coherence tomography (OCT) uses a broad bandwidth light source in an interferometric setup. Interference is detected only if the object pathlength equals the reference pathlength. The pathlength of the photons to be detected (backscattered from the tissue) can be adjusted by the reference pathlength. The main advantage of OCT is that the pathlength resolution is roughly given by the coherence length lC of the light, which can be in the micrometer range.
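As a quick sanity check on the ‘‘micrometer range’’ claim, the coherence length can be estimated from the source parameters with the common rule of thumb lC ≈ λ0²/Δλ (prefactors of order one depend on the spectral shape and on convention; the numbers below are the ones used later in this chapter):

```python
lam0 = 840e-9          # center wavelength, m
dlam = 20e-9           # spectral FWHM, m
lc = lam0**2 / dlam    # rule-of-thumb coherence length
print(f"l_C ≈ {lc*1e6:.1f} µm")   # prints: l_C ≈ 35.3 µm
```

This reproduces the lC = 35 µm quoted in Section 12.4 for the same SLD parameters.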
Spectral Radar | 337
The major problem in optical tomography is the strong scattering of most types of biological tissue (more in skin than in the retina). As a result of multiple scattering, photons that have traveled the same optical pathlength may have traveled along different individual paths, reaching different depths in the scatterer. This increases the depth uncertainty of the measurements. By confocal imaging of the backscattered photons onto a fiber core, photons that have been scattered several times, and whose run time is therefore poorly correlated with depth, no longer reach the fiber.
We again mention one important aspect of the signal formation: Imaging with a finite aperture causes subjective speckle at the entrance of the fiber. Speckle contrast is the actual carrier of information. Because of the strong incoherent background, this speckle contrast is extremely low for strongly scattering media. Sophisticated electronics and data processing are necessary. As a consequence, the requirements for an extremely high dynamic range limit access to the deeper layers of the skin. So consideration of the achievable dynamic range will play an important role in the comparison of competing methods.
12.3 THE FAMILY OF OCT SENSORS
In order to investigate the morphological 3-D data of biological objects, methods based on optical coherence tomography (OCT) have become more and more important during recent years. The OCT methods can be divided into two classes, with sensors based on time domain measuring principles (TDOCT) or on Fourier domain principles (FDOCT).
Both TDOCT and FDOCT use a broad-bandwidth light source in an interferometric setup. For measurement along an axis (z axis) from the surface into the bulk (A-scan) with TDOCT, the reference mirror has to be scanned through the depth. Interference contrast is detected only if the object pathlength equals the reference pathlength. The pathlength of the photons to be detected can be adjusted by the reference pathlength. In TDOCT the scatterers are measured sequentially. Therefore, light that is scattered back from a scatterer contributes to the interference signal only if the distance between the reference plane and the scatterer is less than the coherence length. Only a fraction of the light that is scattered back during the entire measuring time is utilized. Time domain methods have been investigated in a multitude of modifications [5–15].
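The sequential coherence gating of TDOCT can be sketched numerically: integrating a Gaussian spectrum S(k) over the detector gives fringes versus the path difference D whose envelope collapses once D exceeds roughly the coherence length. (A sketch with the source values used in Section 12.4; the grid sizes are arbitrary choices.)

```python
import numpy as np

# Source: Gaussian spectrum, 840 nm center, 20 nm FWHM.
lam0, dlam = 840e-9, 20e-9
k0 = 2*np.pi/lam0
dk = 2*np.pi*dlam/lam0**2                     # spectral FWHM in k
k = np.linspace(k0 - 3*dk, k0 + 3*dk, 1000)
S = np.exp(-4*np.log(2)*((k - k0)/dk)**2)

# Fringe envelope vs. path difference D: |sum of S(k) exp(2ikD)|, normalized.
D = np.linspace(-40e-6, 40e-6, 1601)
step = k[1] - k[0]
fringe = (S[None, :]*np.exp(2j*k[None, :]*D[:, None])).sum(axis=1)*step
env = np.abs(fringe)/(S.sum()*step)

above = D[env > 0.5]
fwhm = above.max() - above.min()              # envelope FWHM, ≈ 15.6 µm
```

The envelope FWHM comes out as (2 ln 2/π) λ0²/Δλ ≈ 15.6 µm, of the same order as the estimate lC ≈ λ0²/Δλ ≈ 35 µm; the exact prefactor depends on how lC is defined.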
Fourier domain principles avoid scanning of the reference through the depth range. These OCT sensors acquire depth information by evaluation of the spectrum of the interferogram. The Fourier transformation of the spectrum delivers the depth information. All scatterers are simultaneously measured by FDOCT sensors. Light that is scattered back from each scatterer within the volume contributes to the interference signal over the entire measuring time. For sensors of this type there are several approaches. A broad-bandwidth light source is used for the illumination of the interferometer. The interferometer output is spectrally decomposed, and the whole spectrum is detected by an array of photodiodes. This specific implementation can be adapted for measurements on the (transparent) eye [5–7] as well as for measurements of strongly scattering skin [4,16–19]. In a further modification, the spectrum can be produced by a tunable laser and then be detected by a single photodiode [20,21].
In both classes of OCT sensors the speckle contrast is the actual source of information [22,23].
12.4 SPECTRAL RADAR IN A NUTSHELL
As explained in the Introduction, we called our implementation of FDOCT ‘‘spectral radar’’ because it is one modification of coherence radar. Spectral radar uses an OCT sensor working in the Fourier domain [4,18,19]. The measuring principle is based on spectral interferometry. Spectral radar measures the scattering amplitude a(z) along one A-scan within one detector exposure. No scanning in depth is necessary, so a short measurement time is generally possible. For two-dimensional imaging a transverse scan is necessary (B-scan). The sensor is a Michelson interferometer (Fig. 1) coupled with a spectrometer.
In one specific implementation, the light source is a superluminescent diode (SLD) in the near-infrared range with a short coherence length lC (central wavelength λ0 = 840 nm, FWHM Δλ = 20 nm, coherence length lC = 35 µm; output power P = 1.7 mW). The SLD is imaged onto the object surface and onto a reference mirror. The signal from the object consists of many elementary waves emanating from different depths z. We neglect the dispersion in the object. The scattering amplitude of the elementary waves versus depth is a(z) [a(z) is assumed to be real here]. The object signal is superimposed with the plane reference wave aR. At the exit of the interferometer we locally separate the different wavenumbers k (= 2π/λ) by use of a spectrometer. The interference signal I(k) is in principle [4,18] (a detailed description is given in Section 12.5.1)
I(k) = S(k)[1 + ∫₀^∞ a(z) cos(2knz) dz + AC terms]        (1)
where S(k) is the spectral intensity distribution of the light source and n is the refractive index of the scatterer.
It can be seen that I(k) is a sum of three terms. Besides a constant offset, the second term encodes the depth information of the object. It is a sum of cosine functions, where the amplitude of each cosine is proportional to the scattering amplitude a(z). The depth z of the scattering event is encoded in the frequency 2nz of the cosine function. This term describes the well-known Müller fringes of spectral interferometry [24]. It will be seen that a(z) can be acquired by a Fourier transformation of the interferogram [25]. The third, autocorrelation term (AC terms), describes the mutual interference of all elementary waves [Eq. (5)].
Figure 1  Basic principle of the setup of spectral radar. SLD, superluminescent diode; R, reference; REo, reference plane in the object arm; G, grating spectrometer; PDA, photodiode array; a(z), scattering amplitude.
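A toy numerical version of this reconstruction, omitting the AC terms of Eq. (1) and using made-up scatterer depths and amplitudes (nothing below is from the authors' experiment):

```python
import numpy as np

# Spectral-radar forward model: Gaussian S(k), two point scatterers.
lam0, dlam = 840e-9, 20e-9
k0 = 2*np.pi/lam0
dk_fwhm = 2*np.pi*dlam/lam0**2
N = 4096
k = np.linspace(k0 - 3*dk_fwhm, k0 + 3*dk_fwhm, N)
S = np.exp(-4*np.log(2)*((k - k0)/dk_fwhm)**2)   # source spectrum S(k)

n = 1.4                                  # refractive index (assumed)
scatterers = {50e-6: 0.8, 120e-6: 0.4}   # depth z -> amplitude a(z), toy values
I = S*(1 + sum(a*np.cos(2*k*n*z) for z, a in scatterers.items()))

# Reconstruction: divide out S(k), remove the offset, Fourier transform.
A = np.abs(np.fft.rfft(I/S - 1))
step = k[1] - k[0]
z_axis = np.pi*np.arange(A.size)/(N*step*n)      # FFT bin -> depth
z_peak = z_axis[np.argmax(A)]
print(f"strongest scatterer recovered at ≈ {z_peak*1e6:.0f} µm")
```

Dividing by S(k) removes the source envelope before the transform; the peaks of the Fourier magnitude then land at the scatterer depths within the axial resolution set by the spectral width.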
The main feature of all FDOCT sensors is that the total distribution of the scattering amplitude aðzÞ along one A-scan is measured at one time. The light that is scattered back from each scatterer in the volume contributes to the interference signal during the entire measuring time. This is the crucial difference between FDOCT sensors and TDOCT sensors. For an A-scan by TDOCT sensors, the reference mirror has to be scanned through the depth. The scatterers are measured sequentially. Therefore, only a fraction of the light that is scattered back during the total measuring time can be considered.
These differences between the two techniques have a strong influence on the dynamic range, which is the ratio of the maximum and minimum measurable power P of the signal emanating from the object. It has been shown that the shot noise of the photons is the main physical limitation of the dynamic range [18,19]. From this fact it can be deduced that the dynamic range is limited only by the number of photons from the object that contribute to the interference signal within the measuring time. The dynamic range of FDOCT sensors is [18,19]
DFD = 10 log[η PO t/(4 SNRF² hν)]        (2)
where SNRF is the minimal necessary signal-to-noise ratio (which will be set to a value of 2 later), PO is the total power from the object, t is the measuring time, hν is the energy of one photon, and η is the quantum efficiency of the photodiode.
Here the central result has to be emphasized: The more photons from the object contribute to the interference signal, the higher is the dynamic range. Therefore FDOCT sensors have a higher dynamic range than TDOCT sensors, because in FDOCT the light from all scatterers contributes to the signal during the total measuring time, whereas in TDOCT only a fraction of the backscattered light causes interference contrast. The dynamic range of FDOCT can in principle be greater by typically 14 dB. We summarize these and other considerations in Table 1.
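To get a feeling for the magnitude of this dynamic range, one can evaluate the shot-noise expression of Eq. (2) with made-up but plausible numbers (none of these values are taken from the chapter):

```python
import math

SNR_F = 2        # minimum usable SNR, as in the text
P_O = 1e-6       # total power from the object, W (illustrative)
t = 0.04         # measuring time, s (illustrative)
eta = 0.5        # photodiode quantum efficiency (illustrative)
h_nu = 6.626e-34*3.0e8/840e-9   # photon energy at 840 nm, J

# Shot-noise-limited dynamic range in dB.
D_FD = 10*math.log10(eta*P_O*t/(4*SNR_F**2*h_nu))
print(f"D_FD ≈ {D_FD:.0f} dB")   # prints: D_FD ≈ 97 dB
```

Even a microwatt-level object signal integrated for tens of milliseconds yields a shot-noise-limited dynamic range near 100 dB, which illustrates why collecting light from all depths simultaneously pays off.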
The speed of both methods is presently limited by the exposure time needed to integrate sufficient photons or, in other words, by the power of the source. If strong sources eventually become available, the speed of TDOCT might be limited by the mechanics of the depth scan.
One interesting further difference is the requirement for the spectral quality of the source. The correlogram A(z) [see Section 12.5.1 and Eq. (9)], which is the spatial impulse response of both types of sensors, is the Fourier transform of the source spectrum S(k). If S(k) is not a smooth function (such as a Gaussian) but is disturbed by some high frequency modulation, then A(z) displays peaks far away from z = 0. Because we want to see extremely small signals deep in the skin, even small peaks of A(z) are extremely disturbing, because these artifacts cannot be distinguished from real structure.
Here FDOCT offers a solution: We can compensate for a non-Gaussian spectrum by dividing I(k) by S(k) [see Eq. (9)]. S(k) can easily be measured in a separate step in our system. This is not possible with TDOCT. It should be mentioned that the unavailability of good sources is the bottleneck of optical tomography. A smooth spectrum S(k) is difficult to generate, because of the high coherence: Longitudinal modes are unavoidable owing to parasitic reflections at the different surfaces of the system.

Table 1  Fourier Domain OCT vs. Time Domain OCT

Parameter             | FD OCT                              | TD OCT
----------------------|-------------------------------------|-------------------------------------
Dynamic range         | DFD = 10 log N0 = DTD + 14 dB       | DTD = 10 log N0 - 10 log(Δz/lC)
Speed                 | Limited by source                   | Limited by source
Source requirements   | Low spectral quality sufficient     | High spectral quality required
Vibration sensitivity | Sensitive for T > 10 ms             | Not very sensitive
Technology            | Sophisticated                       | Less sophisticated; mechanical scan
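The compensation by dividing I(k) by the measured source spectrum S(k) can be demonstrated on synthetic data: a ripple on the spectrum produces a ghost peak in the Fourier transform that disappears after the division. A sketch in sample-index units (the fringe and ripple frequencies are arbitrary choices, not measured values):

```python
import numpy as np

N = 4096
j = np.arange(N)

# Gaussian envelope disturbed by a ripple at "bin 60" (e.g., a longitudinal-
# mode beat); the true scatterer fringe sits at bin 24.
G = np.exp(-4*np.log(2)*((j - N/2)/(N/6))**2)
S = G*(1 + 0.3*np.cos(2*np.pi*60*j/N))       # measured source spectrum S(k)
I = S*(1 + 0.8*np.cos(2*np.pi*24*j/N))       # recorded interferogram

A_raw = np.abs(np.fft.rfft(I - I.mean()))    # no compensation: ghost at bin 60
A_comp = np.abs(np.fft.rfft(I/S - 1))        # divided by S(k) first

ghost_raw, ghost_comp = A_raw[60], A_comp[60]
```

After compensation only the true scatterer peak at bin 24 survives; the ghost at the ripple frequency drops to numerical noise, which is exactly why a separately measured S(k) makes low spectral quality tolerable in FDOCT.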
Finally, we should mention some drawbacks of FDOCT. All waves that are scattered within the volume interfere during the exposure time. Hence, FDOCT is more sensitive to vibration, or to moving scatterers, than TDOCT. In TDOCT we measure only a fraction of the volume at any one time; hence the system is less sensitive to motion artifacts by a factor given by the ratio of coherence length to measured depth. However, for total measuring times below about 1–10 ms, we found that the interference patterns appear stable.
In addition, there is one more disadvantage to the use of FDOCT: the considerable technical effort needed for the high resolution spectrometer. This disadvantage comes into play specifically at longer wavelengths (e.g., 1300 nm), which cannot be detected with silicon technology.
12.5 PHYSICS OF SPECTRAL RADAR
12.5.1 Fiber-Optic Setup
For two-dimensional imaging, transverse scanning is necessary (B-scan). In order to perform in vivo measurements at different sites in the human body, spectral radar is implemented as a (single-mode) fiber interferometer [4]. The sensor is a modified Michelson interferometer (Fig. 2). The light source is a near-infrared superluminescent diode (SLD). In order to find the location of the measurement on the skin we use an additional pilot laser in the visible range. The light is coupled into the interferometer by a 50/50 fiber coupler.
In the reference arm we focus the beam onto a reference mirror. In the object arm the same combination of lenses is used to focus the light onto the skin. We use this combination to have two degrees of freedom in the setup. First, we need to adjust the optical pathlength in both arms. The reference plane is positioned at a distance z0 of about 200 µm in front of the object surface, in order to get rid of the source spectrum (‘‘correlogram’’) and the autocorrelation terms (described below). Second, we can vary the position of the focus of the illumination beam within the skin. The light is focused into the skin at a depth of about 200 µm. The diameter of the spot at the surface is about 50 µm, and the power in the focus is about 360 µW. The light is focused within the skin in order to enhance the interference contrast in deeper
