
2. Solution of standard examples

Example 2.1. Two signals x1 and x2 with a priori probabilities P(x1) = 3/4 and P(x2) = 1/4 are transmitted over a binary symmetric communication channel with noise. Because of the noise, the probability of correct reception of each signal is reduced to 7/8. The duration of one signal is τ. It is required to determine:

1) the productivity and redundancy of the source;

2) the information transfer rate and the capacity of the communication channel.

Solution. We measure the quantity of information in bits and use the results obtained in the solution of Example 2.2.1, namely:

• the conditional probabilities P(yj/xi) of receiving messages y1, y2 given that messages x1, x2 were transmitted:

P(y1/x1) = P(y2/x2) = 7/8, P(y2/x1) = P(y1/x2) = 1/8;

• the quantity of information at the channel input per message:

H(X) = -P(x1) log2 P(x1) - P(x2) log2 P(x2) = -(3/4) log2(3/4) - (1/4) log2(1/4) ≈ 0.81 bit;

• the average quantity of mutual information I(Y, X) = IXY per message:

I(Y, X) = H(Y) - H(Y/X) ≈ 0.90 - 0.54 ≈ 0.35 bit.
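These preliminary quantities can be checked numerically. The sketch below (our own illustration in Python, not part of the original solution) recomputes H(X), H(Y), H(Y/X) and I(Y, X) for the given binary symmetric channel:

```python
import math

def entropy(probs):
    """Entropy of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Data of Example 2.1
p_x = [3/4, 1/4]          # a priori probabilities P(x1), P(x2)
p_ok, p_err = 7/8, 1/8    # probabilities of correct and erroneous reception

# Output probabilities P(yj) = sum_i P(xi) * P(yj/xi)
p_y = [p_x[0] * p_ok + p_x[1] * p_err,
       p_x[0] * p_err + p_x[1] * p_ok]

H_X = entropy(p_x)                    # entropy at the channel input, ~0.81 bit
H_Y = entropy(p_y)                    # entropy at the channel output, ~0.90 bit
H_Y_given_X = entropy([p_ok, p_err])  # the same for both inputs of a BSC
I_YX = H_Y - H_Y_given_X              # mutual information, ~0.35 bit

print(f"H(X) = {H_X:.3f} bit, I(Y,X) = {I_YX:.3f} bit")
```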

Using these results, let us calculate the information characteristics of the source and the communication channel:

1) according to (2), the productivity of the source is

vI(X) = H(X)/τ ≈ 0.81/τ bit/s;

2) according to (1), with the maximum quantity of information of the source H(X)max = log2 2 = 1 bit, the redundancy of the source is

R = 1 - H(X)/H(X)max ≈ 1 - 0.81 = 0.19;

3) according to (3), the information transfer rate is

vI = I(Y, X)/τ ≈ 0.35/τ bit/s;

4) according to (6), with error probability p = 1/8 the capacity of the channel per signal is

Csignal = 1 + p log2 p + (1 - p) log2(1 - p) ≈ 1 - 0.54 = 0.46 bit,

and the capacity per unit time is

C = Csignal/τ ≈ 0.46/τ bit/s.

Comparison of C and vI(X) shows that the capacity of the given channel does not allow information transfer with an arbitrarily small error probability by means of noise-resistant coding, since vI(X) > C (according to Shannon's theorem).
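A minimal continuation of the sketch above for items 1)-4); the per-signal values are hardcoded (rounded) from that computation, and τ is eliminated by comparing quantities per signal:

```python
import math

# Per-signal values from the sketch above (rounded)
H_X = 0.811    # source entropy, bit per signal
I_YX = 0.352   # mutual information, bit per signal
p = 1/8        # error probability of the binary symmetric channel

# Redundancy against the maximum entropy H(X)max = log2(2) = 1 bit
R = 1 - H_X / 1.0

# Capacity of a BSC per signal: C = 1 + p*log2(p) + (1-p)*log2(1-p)
C_signal = 1 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

print(f"R = {R:.2f}")                        # ~0.19
print(f"C = {C_signal:.2f} bit per signal")  # ~0.46
# Dividing H_X, I_YX and C_signal by tau gives vI(X), vI and C;
# since H_X > C_signal, vI(X) > C, as the comparison above states.
```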

Example 2.2. The alphabet of a source contains N = 4 symbols. The probabilities of occurrence of the source symbols P(x1), P(x2), P(x3) and P(x4) are given.

There are correlation dependencies between adjacent symbols, described by a matrix of conditional probabilities P(xi/xj): entry (i, j) gives, for example, the probability that symbol xi follows symbol xj.

It is required to determine the redundancy of the source: R1 assuming statistical independence of the symbols, and R2 taking the dependence between symbols into account.

Solution. To estimate the redundancy, it is necessary to find the unconditional entropy H1(X) and the conditional entropy H2(X/X) of the source.

Neglecting the statistical dependence, on the basis of (1.1) we have for H1(X)

H1(X) = -Σi P(xi) log2 P(xi).

Taking the correlation dependence into account, on the basis of (5) or (7),

H2(X/X) = -Σj P(xj) Σi P(xi/xj) log2 P(xi/xj).

The greatest possible entropy of a source with four symbols is given by Hartley's measure:

Hmax(X) = log2 N = log2 4 = 2 bit.

Hence,

R1 = 1 - H1(X)/Hmax(X);

R2 = 1 - H2(X/X)/Hmax(X).
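Since the numerical probabilities and the conditional-probability matrix are not reproduced above, the sketch below uses hypothetical values chosen purely for illustration; only the structure of the calculation follows the solution:

```python
import math

def entropy(probs):
    """Entropy of a probability vector, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical data (the original values are not reproduced in the text)
p_x = [0.5, 0.25, 0.125, 0.125]   # assumed P(x1)..P(x4)
p_cond = [                        # assumed P(xi/xj): row j is the
    [0.4, 0.3, 0.2, 0.1],         # distribution of the next symbol
    [0.25, 0.25, 0.25, 0.25],     # after symbol xj
    [0.1, 0.2, 0.3, 0.4],
    [0.3, 0.3, 0.2, 0.2],
]

H1 = entropy(p_x)                                        # unconditional entropy
H2 = sum(p_x[j] * entropy(p_cond[j]) for j in range(4))  # conditional entropy
H_max = math.log2(4)                                     # Hartley measure: 2 bit

R1 = 1 - H1 / H_max
R2 = 1 - H2 / H_max
print(f"H1 = {H1:.3f} bit, H2 = {H2:.3f} bit")
print(f"R1 = {R1:.3f}, R2 = {R2:.3f}")
```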

Example 2.3. An ensemble of N = 3 signals xi with duration τ and repetition frequency F is transmitted over a communication channel. The source of signals is described by a matrix P(X) = Px of unconditional probabilities P(x1), P(x2), P(x3).

The communication channel is characterized by its parameters and by a matrix of conditional probabilities P(yj/xi), where yj is the ensemble of signals at the channel output (i.e., at the receiver).

It is required to determine the capacity of the channel and to compare the productivity of the source with the capacity of the channel.

Solution. By the statement of the problem, the rate vX at which the signals are generated and the rate vK at which they are transferred over the channel are equal, i.e. vX = vK. These rates correspond to the repetition frequency of the signals, i.e.

vX = F and vK = F.

According to definition (5), the capacity is

C = vK max{I(Y, X)} = F max{I(Y, X)},

where the maximum is taken over all probability distributions P(X) and P(Y).

Let us find the unconditional probabilities P(yj):

P(yj) = Σi P(xi) P(yj/xi), j = 1, 2, 3.

Hence P(y1), P(y2) and P(y3) are obtained by substituting the given matrices.

According to (1), the unconditional entropy of the output signal is

H(Y) = -Σj P(yj) log2 P(yj).

On the basis of (7), taking into account that P(xi, yj) = P(xi) P(yj/xi), the conditional entropy H(Y/X) of the output signal Y with respect to the input X is

H(Y/X) = -Σi P(xi) Σj P(yj/xi) log2 P(yj/xi).

Expanding the sum over i, note that every row of the channel matrix contains the same set of probabilities, so the inner sum is the same for every xi. Since the probabilities P(xi) sum to one, the conditional entropy becomes

H(Y/X) = -Σj P(yj/xi) log2 P(yj/xi) (for any fixed i)

and does not depend on the statistics of the input and output signals. It is completely determined by the parameters of the channel matrix.
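The independence of H(Y/X) from the input statistics is easy to demonstrate numerically. The sketch below uses a hypothetical symmetric channel matrix (the original matrix is not reproduced) in which every row is a permutation of the same probabilities:

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical symmetric channel matrix: every row is a permutation
# of the same set of probabilities (assumed values for illustration)
P_cond = [[0.8, 0.1, 0.1],
          [0.1, 0.8, 0.1],
          [0.1, 0.1, 0.8]]

def H_Y_given_X(p_x):
    """Conditional entropy H(Y/X) for an input distribution p_x."""
    return sum(p_x[i] * entropy(P_cond[i]) for i in range(3))

# The same value is obtained for any input distribution:
print(H_Y_given_X([1/3, 1/3, 1/3]))  # equals the entropy of one row
print(H_Y_given_X([0.7, 0.2, 0.1]))  # identical value
```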

According to (3), the quantity of information at the communication channel output is

I(X, Y) = H(Y) - H(Y/X).

It is maximal when the entropy H(Y) of the receiver is maximal. The entropy H(Y) is maximal when the signals at the channel output are equiprobable, i.e., when for N = 3 signals their probabilities are

P(y1) = P(y2) = P(y3) = 1/N = 1/3.

In this case the entropy of the output signals of the channel corresponds to Hartley's measure and equals log2 N, i.e.

H(Y)max = log2 3 ≈ 1.58 bit.

Thus, the maximum quantity of information at the communication channel output, defined as I(X, Y)max = H(Y)max - H(Y/X), will be

I(X, Y)max = log2 N - H(Y/X).

The capacity of the channel then amounts to

C = F·I(X, Y)max = F [log2 N - H(Y/X)].

According to (1), the unconditional entropy of the input signal is

H(X) = -Σi P(xi) log2 P(xi).

Thus, according to (3.2) and (2.2), the productivity of the source amounts to

vI(X) = F·H(X).

Since vI(X) > C, the communication channel cannot be used for information transfer from the given source.
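Putting Example 2.3 together numerically requires the elided data, so the sketch below assumes a repetition frequency, a source distribution and a symmetric channel matrix purely for illustration; it compares vI(X) = F·H(X) with C = F[log2 N - H(Y/X)]:

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical data (the original P(X), channel matrix and F are
# not reproduced in the text; these values are for illustration only)
F = 1000.0                  # signal repetition frequency, Hz
p_x = [0.5, 0.3, 0.2]       # assumed P(x1)..P(x3)
P_cond = [[0.8, 0.1, 0.1],  # assumed symmetric channel matrix
          [0.1, 0.8, 0.1],
          [0.1, 0.1, 0.8]]

H_Y_given_X = entropy(P_cond[0])      # the same for every row
C = F * (math.log2(3) - H_Y_given_X)  # channel capacity, bit/s
v_I = F * entropy(p_x)                # source productivity, bit/s

print(f"C = {C:.0f} bit/s, vI(X) = {v_I:.0f} bit/s")
print("channel is sufficient" if v_I <= C else "channel is insufficient")
```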

Example 2.4. Determine the greatest possible rate of information transfer over a radio communication channel between a control point and a long-range rocket, if the pass-band of the communication channel is F and the minimum signal-to-noise power ratio during guidance of the rocket to the target is x = Px/Pn.

Solution. On the basis of (5.8), the capacity of the given continuous channel amounts to

C = F log2(1 + Px/Pn) = F log2(1 + x).
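The formula is easy to evaluate once F and x are known; the values below are assumed purely for illustration, since the originals are not reproduced in the text:

```python
import math

# Hypothetical values (the original F and x are not reproduced)
F = 3400.0   # channel pass-band, Hz, assumed
x = 15.0     # minimum signal-to-noise power ratio Px/Pn, assumed

# Shannon's formula for a continuous channel: C = F * log2(1 + x)
C = F * math.log2(1 + x)
print(f"C = {C:.0f} bit/s")  # greatest possible information transfer rate
```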