
2. Information characteristics of sources, messages and communication channels.

One of the basic tasks of the theory of information transmission is the development of methods for calculating these characteristics.

Let us consider this task for the simpler case of discrete message transmission.

Let the volume of the alphabet A of a discrete message source be denoted by m. Recall that the alphabet is the finite set of symbols used to form messages. Assume that each message consists of n symbols. Let us show how to determine the quantity of information in the messages of such a source.

It would be possible to use the total number

N0 = m^n (2.1)

of messages of length n as an information characteristic of the message source, but this is inconvenient because of the exponential dependence of N0 on n.

In 1928 R. Hartley proposed taking the logarithm of this quantity and using the logarithmic measure of information quantity

I = log N0 = n log m. (2.2)
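A quick numerical illustration of Hartley's measure (2.2); the function name and the example values are mine, not from the source:

```python
import math

def hartley_information(m: int, n: int, base: float = 2.0) -> float:
    """Hartley measure I = n * log(m): the information quantity in a
    message of n symbols from an alphabet of m equiprobable symbols."""
    return n * math.log(m, base)

# Binary alphabet (m = 2): each symbol carries exactly 1 bit,
# so an 8-symbol message carries 8 bits.
print(hartley_information(2, 8))   # 8.0
```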

Formula (2.2) does not reflect the random character of message formation. To remove this shortcoming, the quantity of information must be connected with the probabilities of symbol occurrence. This task was solved by C. Shannon in 1948. The solution proceeds as follows. If all symbols of the alphabet are equiprobable, the quantity of information carried by one symbol is I1 = log m. The probability of a symbol is p = 1/m, hence m = 1/p. Substituting this into the formula for I1 gives I1 = log(1/p) = -log p. This relation already connects the quantity of information carried by one symbol with the probability of that symbol. In real messages the symbols have different probabilities. Denote by p(ai) the probability of occurrence of a symbol ai, ai ∈ A, in the message. Then the quantity of information carried by this symbol is Ii = -log p(ai).

The source entropy, defined as the average quantity of information carried by one symbol of the message source, is obtained by averaging over the entire alphabet:

H = -Σ p(ai) log2 p(ai)  (summation over i = 1, …, m), bit/symb.
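The entropy formula above can be sketched as a short routine; the distributions used below are illustrative, not taken from the source:

```python
import math

def entropy(probabilities, base: float = 2.0) -> float:
    """Source entropy H = -sum p_i * log(p_i), in bits per symbol.
    Terms with p_i = 0 contribute nothing and are skipped."""
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

# Equiprobable alphabet of m = 4 symbols: H = log2(4) = 2 bits/symbol,
# which matches Hartley's measure for one symbol.
print(entropy([0.25] * 4))                    # 2.0

# A skewed distribution carries less information per symbol.
print(entropy([0.5, 0.25, 0.125, 0.125]))     # 1.75
```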

Conditional source entropy.

To take into account statistical (correlation) connections between the symbols of a message, the concept of conditional entropy was introduced:

H(A|A) = -Σi Σj p(ai) p(aj|ai) log2 p(aj|ai), bit/symb., (2.3)

where p(aj|ai) is the probability of occurrence of the symbol aj given that the symbol ai appeared immediately before it.
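A minimal sketch of the conditional-entropy computation in (2.3); the function name and the probability tables are my own illustrative assumptions:

```python
import math

def conditional_entropy(p_symbol, p_cond, base: float = 2.0) -> float:
    """Conditional entropy H = -sum_i sum_j p(a_i) p(a_j|a_i) log p(a_j|a_i).

    p_symbol[i]  -- probability of symbol a_i
    p_cond[i][j] -- probability of a_j given that a_i preceded it
    """
    h = 0.0
    for p_i, row in zip(p_symbol, p_cond):
        for p_j_given_i in row:
            if p_j_given_i > 0:
                h -= p_i * p_j_given_i * math.log(p_j_given_i, base)
    return h

# Independent symbols: every row of p_cond equals p_symbol, and the
# conditional entropy reduces to the ordinary source entropy.
p = [0.5, 0.5]
print(conditional_entropy(p, [p, p]))                     # 1.0

# Strong correlation (a symbol tends to repeat its predecessor)
# lowers the average information carried by one symbol.
print(conditional_entropy(p, [[0.9, 0.1], [0.1, 0.9]]))
```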

Conditional entropy is the average quantity of information carried by one symbol of a message given that there are correlation connections between any two adjacent symbols. Because of correlation connections between symbols, and because symbols do not occur equiprobably in real messages, the quantity of information carried by one symbol decreases. Quantitatively, this loss of information is characterized by the redundancy factor

Kr = (H2 - H1) / H2 = 1 - H1 / H2, (2.4)

where H1 is the quantity of information carried by one symbol in real messages,

and H2 is the maximum quantity of information that one symbol can carry. For European languages the redundancy of messages is high (Shannon's estimate for English was about 0.5).
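The redundancy factor (2.4) is a one-line computation; the numeric values below are illustrative assumptions, not figures from the source:

```python
def redundancy(h_real: float, h_max: float) -> float:
    """Redundancy factor Kr = (H_max - H_real) / H_max = 1 - H_real / H_max."""
    return (h_max - h_real) / h_max

# A 32-symbol alphabet could carry up to log2(32) = 5 bits/symbol;
# if real text carried only 1.5 bits/symbol, the redundancy would be:
print(redundancy(1.5, 5.0))   # 0.7
```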