
BIBLIOGRAPHY

[59] R. M. Gray, D. S. Ornstein, and R. L. Dobrushin. Block synchronization, sliding-block coding, invulnerable sources and zero error codes for discrete noisy channels. Ann. Probab., 8:639–674, 1980.

[60] R. M. Gray, M. Ostendorf, and R. Gobbi. Ergodicity of Markov channels. IEEE Trans. Inform. Theory, 33:656–664, September 1987.

[61] R. M. Gray and F. Saadat. Block source coding theory for asymptotically mean stationary sources. IEEE Trans. Inform. Theory, 30:64–67, 1984.

[62] P. R. Halmos. Lectures on Ergodic Theory. Chelsea, New York, 1956.

[63] G. H. Hardy, J. E. Littlewood, and G. Polya. Inequalities. Cambridge Univ. Press, London, 1952. Second edition, 1959.

[64] R. V. L. Hartley. Transmission of information. Bell System Tech. J., 7:535–563, 1928.

[65] E. Hopf. Ergodentheorie. Springer-Verlag, Berlin, 1937.

[66] K. Jacobs. Die Übertragung diskreter Informationen durch periodische und fastperiodische Kanäle. Math. Annalen, 137:125–135, 1959.

[67] K. Jacobs. Über die Struktur der mittleren Entropie. Math. Z., 78:33–43, 1962.

[68] K. Jacobs. The ergodic decomposition of the Kolmogorov-Sinai invariant. In F. B. Wright, editor, Ergodic Theory. Academic Press, New York, 1963.

[69] N. S. Jayant and P. Noll. Digital Coding of Waveforms. Prentice-Hall, Englewood Cliffs, New Jersey, 1984.

[70] T. Kadota. Generalization of Feinstein's fundamental lemma. IEEE Trans. Inform. Theory, IT-16:791–792, 1970.

[71] S. Kakutani. Induced measure preserving transformations. In Proceedings of the Imperial Academy of Tokyo, volume 19, pages 635–641, 1943.

[72] A. J. Khinchine. The entropy concept in probability theory. Uspekhi Matematicheskikh Nauk., 8:3–20, 1953. Translated in Mathematical Foundations of Information Theory, Dover, New York (1957).

[73] A. J. Khinchine. On the fundamental theorems of information theory. Uspekhi Matematicheskikh Nauk., 11:17–75, 1957. Translated in Mathematical Foundations of Information Theory, Dover, New York (1957).

[74] J. C. Kieffer. A counterexample to Perez's generalization of the Shannon-McMillan theorem. Ann. Probab., 1:362–364, 1973.

[75] J. C. Kieffer. A general formula for the capacity of stationary nonanticipatory channels. Inform. and Control, 26:381–391, 1974.

[76] J. C. Kieffer. On the optimum average distortion attainable by fixed-rate coding of a nonergodic source. IEEE Trans. Inform. Theory, IT-21:190–193, March 1975.

[77] J. C. Kieffer. A generalization of the Pursley-Davisson-Mackenthun universal variable-rate coding theorem. IEEE Trans. Inform. Theory, IT-23:694–697, 1977.

[78] J. C. Kieffer. A unified approach to weak universal source coding. IEEE Trans. Inform. Theory, IT-24:674–682, 1978.

[79] J. C. Kieffer. Extension of source coding theorems for block codes to sliding block codes. IEEE Trans. Inform. Theory, IT-26:679–692, 1980.

[80] J. C. Kieffer. Block coding for weakly continuous channels. IEEE Trans. Inform. Theory, IT-27:721–727, 1981.

[81] J. C. Kieffer. Sliding-block coding for weakly continuous channels. IEEE Trans. Inform. Theory, IT-28:2–10, 1982.

[82] J. C. Kieffer. Coding theorem with strong converse for block source coding subject to a fidelity constraint, 1989. Preprint.

[83] J. C. Kieffer. An ergodic theorem for constrained sequences of functions. Bulletin American Math Society, 1989.

[84] J. C. Kieffer. Sample converses in source coding theory, 1989. Preprint.

[85] J. C. Kieffer. Elementary information theory. Unpublished manuscript, 1990.

[86] J. C. Kieffer and M. Rahe. Markov channels are asymptotically mean stationary. SIAM Journal on Mathematical Analysis, 12:293–305, 1980.

[87] A. N. Kolmogorov. On the Shannon theory of information in the case of continuous signals. IRE Transactions Inform. Theory, IT-2:102–108, 1956.

[88] A. N. Kolmogorov. A new metric invariant of transitive dynamic systems and automorphisms in Lebesgue spaces. Dokl. Akad. Nauk SSSR, 119:861–864, 1958. (In Russian.)

[89] A. N. Kolmogorov. On the entropy per unit time as a metric invariant of automorphisms. Dokl. Akad. Nauk SSSR, 124:768–771, 1959. (In Russian.)

[90] A. N. Kolmogorov, A. M. Yaglom, and I. M. Gelfand. Quantity of information and entropy for continuous distributions. In Proceedings 3rd All-Union Mat. Conf., volume 3, pages 300–320. Izd. Akad. Nauk. SSSR, 1956.

[91] S. Kullback. A lower bound for discrimination in terms of variation. IEEE Trans. Inform. Theory, IT-13:126–127, 1967.

[92] S. Kullback. Information Theory and Statistics. Dover, New York, 1968. Reprint of 1959 edition published by Wiley.

[93] B. M. Leiner and R. M. Gray. Bounds on rate-distortion functions for stationary sources and context-dependent fidelity criteria. IEEE Trans. Inform. Theory, IT-19:706–708, Sept. 1973.

[94] V. I. Levenshtein. Binary codes capable of correcting deletions, insertions, and reversals. Sov. Phys.-Dokl., 10:707–710, 1966.

[95] S. Lin. Introduction to Error Correcting Codes. Prentice-Hall, Englewood Cliffs, NJ, 1970.

[96] K. M. Mackenthun and M. B. Pursley. Strongly and weakly universal source coding. In Proceedings of the 1977 Conference on Information Science and Systems, pages 286–291, Johns Hopkins University, 1977.

[97] F. J. MacWilliams and N. J. A. Sloane. The Theory of Error-Correcting Codes. North-Holland, New York, 1977.

[98] A. Maitra. Integral representations of invariant measures. Transactions of the American Mathematical Society, 228:209–235, 1977.

[99] J. Makhoul, S. Roucos, and H. Gish. Vector quantization in speech coding. Proc. IEEE, 73, No. 11:1551–1587, November 1985.

[100] B. Marcus. Sofic systems and encoding data. IEEE Trans. Inform. Theory, IT-31:366–377, 1985.

[101] K. Marton. On the rate distortion function of stationary sources. Problems of Control and Information Theory, 4:289–297, 1975.

[102] R. McEliece. The Theory of Information and Coding. Cambridge University Press, New York, NY, 1984.

[103] B. McMillan. The basic theorems of information theory. Ann. of Math. Statist., 24:196–219, 1953.

[104] L. D. Meshalkin. A case of isomorphism of Bernoulli schemes. Dokl. Akad. Nauk SSSR, 128:41–44, 1959. (In Russian.)

[105] Shu-Teh C. Moy. Generalizations of Shannon-McMillan theorem. Pacific Journal Math., 11:705–714, 1961.

[106] J. Nedoma. On the ergodicity and r-ergodicity of stationary probability measures. Z. Wahrsch. Verw. Gebiete, 2:90–97, 1963.

[107] J. Nedoma. The synchronization for ergodic channels. Transactions Third Prague Conf. Information Theory, Stat. Decision Functions, and Random Processes, pages 529–539, 1964.

[108] D. L. Neuhoff and R. K. Gilbert. Causal source codes. IEEE Trans. Inform. Theory, IT-28:701–713, 1982.

[109] D. L. Neuhoff, R. M. Gray, and L. D. Davisson. Fixed rate universal block source coding with a fidelity criterion. IEEE Trans. Inform. Theory, 21:511–523, 1975.

[110] D. L. Neuhoff and P. C. Shields. Channels with almost finite memory. IEEE Trans. Inform. Theory, pages 440–447, 1979.

[111] D. L. Neuhoff and P. C. Shields. Channel distances and exact representation. Inform. and Control, 55(1), 1982.

[112] D. L. Neuhoff and P. C. Shields. Channel entropy and primitive approximation. Ann. Probab., 10(1):188–198, 1982.

[113] D. L. Neuhoff and P. C. Shields. Indecomposable finite state channels and primitive approximation. IEEE Trans. Inform. Theory, IT-28:11–19, 1982.

[114] D. Ornstein. Bernoulli shifts with the same entropy are isomorphic. Advances in Math., 4:337–352, 1970.

[115] D. Ornstein. An application of ergodic theory to probability theory. Ann. Probab., 1:43–58, 1973.

[116] D. Ornstein. Ergodic Theory, Randomness, and Dynamical Systems. Yale University Press, New Haven, 1975.

[117] D. Ornstein and B. Weiss. The Shannon-McMillan-Breiman theorem for a class of amenable groups. Israel J. of Math, 44:53–60, 1983.

[118] D. O'Shaughnessy. Speech Communication. Addison-Wesley, Reading, Mass., 1987.

[119] P. Papantoni-Kazakos and R. M. Gray. Robustness of estimators on stationary observations. Ann. Probab., 7:989–1002, Dec. 1979.

[120] A. Perez. Notions généralisées d'incertitude, d'entropie et d'information du point de vue de la théorie des martingales. In Transactions First Prague Conf. on Information Theory, Stat. Decision Functions, and Random Processes, pages 183–208. Czech. Acad. Sci. Publishing House, 1957.

[121] A. Perez. Sur la convergence des incertitudes, entropies et informations échantillon vers leurs valeurs vraies. In Transactions First Prague Conf. on Information Theory, Stat. Decision Functions, and Random Processes, pages 245–252. Czech. Acad. Sci. Publishing House, 1957.

[122] A. Perez. Sur la théorie de l'information dans le cas d'un alphabet abstrait. In Transactions First Prague Conf. on Information Theory, Stat. Decision Functions, Random Processes, pages 209–244. Czech. Acad. Sci. Publishing House, 1957.

[123] A. Perez. Extensions of Shannon-McMillan's limit theorem to more general stochastic processes. In Third Prague Conf. on Inform. Theory, Decision Functions, and Random Processes, pages 545–574, Prague and New York, 1964. Publishing House Czech. Akad. Sci. and Academic Press.

[124] K. Petersen. Ergodic Theory. Cambridge University Press, Cambridge, 1983.

[125] M. S. Pinsker. Dynamical systems with completely positive or zero entropy. Soviet Math. Dokl., 1:937–938, 1960.

[126] D. Ramachandran. Perfect Measures. ISI Lecture Notes, No. 6 and 7. Indian Statistical Institute, Calcutta, India, 1979.

[127] V. A. Rohlin and Ya. G. Sinai. Construction and properties of invariant measurable partitions. Soviet Math. Dokl., 2:1611–1614, 1962.

[128] V. V. Sazonov. On perfect measures. Izv. Akad. Nauk SSSR, 26:391–414, 1962. American Math. Soc. Translations, Series 2, No. 48, pp. 229–254, 1965.

[129] C. E. Shannon. A mathematical theory of communication. Bell Syst. Tech. J., 27:379–423, 623–656, 1948.

[130] C. E. Shannon. Coding theorems for a discrete source with a fidelity criterion. In IRE National Convention Record, Part 4, pages 142–163, 1959.

[131] P. C. Shields. The Theory of Bernoulli Shifts. The University of Chicago Press, Chicago, Ill., 1973.

[132] P. C. Shields. The ergodic and entropy theorems revisited. IEEE Trans. Inform. Theory, IT-33:263–266, 1987.

[133] P. C. Shields and D. L. Neuhoff. Block and sliding-block source coding. IEEE Trans. Inform. Theory, IT-23:211–215, 1977.

[134] Ya. G. Sinai. On the concept of entropy of a dynamical system. Dokl. Akad. Nauk. SSSR, 124:768–771, 1959. (In Russian.)

[135] Ya. G. Sinai. Weak isomorphism of transformations with an invariant measure. Soviet Math. Dokl., 3:1725–1729, 1962.

[136] Ya. G. Sinai. Introduction to Ergodic Theory. Mathematical Notes, Princeton University Press, Princeton, 1976.

[137] D. Slepian. A class of binary signaling alphabets. Bell Syst. Tech. J., 35:203–234, 1956.

[138] D. Slepian, editor. Key Papers in the Development of Information Theory. IEEE Press, New York, 1973.

[139] A. D. Sokal. Existence of compatible families of proper regular conditional probabilities. Z. Wahrsch. Verw. Gebiete, 56:537–548, 1981.

[140] J. Storer. Data Compression. Computer Science Press, Rockville, Maryland, 1988.

[141] I. Vajda. A synchronization method for totally ergodic channels. In Transactions of the Fourth Prague Conf. on Information Theory, Decision Functions, and Random Processes, pages 611–625, Prague, 1965.

[142] E. van der Meulen. A survey of multi-way channels in information theory: 1961–1976. IEEE Trans. Inform. Theory, IT-23:1–37, 1977.

[143] S. R. S. Varadhan. Large Deviations and Applications. Society for Industrial and Applied Mathematics, Philadelphia, 1984.

[144] L. N. Vasershtein. Markov processes on countable product space describing large systems of automata. Problemy Peredachi Informatsii, 5:64–73, 1969.

[145] A. J. Viterbi and J. K. Omura. Principles of Digital Communication and Coding. McGraw-Hill, New York, 1979.

[146] J. von Neumann. Zur Operatorenmethode in der klassischen Mechanik. Ann. of Math., 33:587–642, 1932.

[147] P. Walters. Ergodic Theory – Introductory Lectures. Lecture Notes in Mathematics No. 458. Springer-Verlag, New York, 1975.

[148] E. J. Weldon, Jr. and W. W. Peterson. Error Correcting Codes. MIT Press, Cambridge, Mass., 1971. Second edition.

[149] K. Winkelbauer. Communication channels with finite past history. Transactions of the Second Prague Conf. on Information Theory, Decision Functions, and Random Processes, pages 685–831, 1960.

[150] J. Wolfowitz. Strong converse of the coding theorem for the general discrete finite-memory channel. Inform. and Control, 3:89–93, 1960.

[151] J. Wolfowitz. Coding Theorems of Information Theory. Springer-Verlag, New York, 1978. Third edition.

[152] A. Wyner. A definition of conditional mutual information for arbitrary ensembles. Inform. and Control, pages 51–59, 1978.