
4.19. PROBLEMS
24. We developed the mean and variance of the sample average Sn for the special case of uncorrelated random variables. Evaluate the mean and variance of Sn for the opposite extreme, where the Xi are highly correlated in the sense that $E[X_iX_k] = E[X_i^2]$ for all i, k.
25. Given n independent random variables $X_i$, $i = 1, 2, \ldots, n$, with variances $\sigma_i^2$ and means $m_i$, define the random variable
$$Y = \sum_{i=1}^{n} a_i X_i ,$$
where the $a_i$ are fixed real constants. Find the mean, variance, and characteristic function of Y.
Now let the mean be constant; i.e., mi = m. Find the minimum variance of Y over the choice of the {ai} subject to the constraint that EY = m. The result is called the minimum variance unbiased estimate of m.
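(A numerical aside, not part of the problem: the constrained minimization can be sanity-checked by simulation. The sketch below, assuming NumPy and hypothetical values for the $\sigma_i^2$, compares equal weights against inverse-variance weights, which is where the constrained minimization leads.)

```python
import numpy as np

rng = np.random.default_rng(0)
sigmas = np.array([0.5, 1.0, 2.0, 4.0])          # hypothetical std deviations
m = 2.0                                           # common mean
X = m + sigmas * rng.standard_normal((200_000, 4))

def est_var(a):
    """Empirical variance of Y = sum_i a_i X_i (a sums to 1, so EY = m)."""
    return (X @ a).var()

a_equal = np.full(4, 0.25)                        # naive equal weights
a_inv = (1 / sigmas**2) / np.sum(1 / sigmas**2)   # inverse-variance weights
print(est_var(a_equal), est_var(a_inv))
print(1 / np.sum(1 / sigmas**2))                  # closed-form minimum variance
```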
Now suppose that {Xi; i = 0, 1, . . . } is an iid random process and that N is a Poisson random variable with parameter λ and that N is independent of the {Xi}. Define the random variable
$$Y = \sum_{i=1}^{N} X_i .$$
Use iterated expectation to find the mean, variance, and characteristic function of Y .
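(A minimal Monte Carlo sketch, assuming NumPy and an arbitrary Uniform(0,1) choice for the $X_i$, can be used to sanity-check whatever closed forms the iterated expectation gives; for a compound Poisson sum the mean is $\lambda E[X]$ and the variance is $\lambda E[X^2]$.)

```python
import numpy as np

rng = np.random.default_rng(1)
lam, trials = 3.0, 100_000                  # hypothetical Poisson parameter
N = rng.poisson(lam, size=trials)
# take the X_i to be Uniform(0, 1) purely for illustration
Y = np.array([rng.uniform(0, 1, size=k).sum() for k in N])

EX, EX2 = 1/2, 1/3                          # moments of Uniform(0, 1)
print(Y.mean(), lam * EX)                   # mean: lambda * E[X]
print(Y.var(), lam * EX2)                   # variance: lambda * E[X^2]
```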
26. The random process of example [3.27] can be expressed as follows: let Θ be a continuous random variable with pdf
$$f_\Theta(\theta) = \frac{1}{2\pi} ; \quad \theta \in [-\pi, +\pi]$$
and define the process $\{X(t);\, t \in \mathbb{R}\}$ by
X(t) = cos(t + Θ) .
(a) Find the cdf FX(0)(x).
(b) Find EX(t).
(c) Find the covariance function KX(t, s).
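(The following illustrative check, assuming NumPy, estimates $EX(t)$ and $K_X(t,s)$ by averaging over many draws of Θ; the empirical covariance should depend only on $t - s$.)

```python
import numpy as np

rng = np.random.default_rng(2)
theta = rng.uniform(-np.pi, np.pi, size=500_000)  # Theta ~ Uniform[-pi, pi]

def X(t):
    return np.cos(t + theta)     # one sample of X(t) per draw of Theta

t, s = 0.7, 0.2                  # arbitrary sample times
print(X(t).mean())               # should be near 0
print(np.mean(X(t) * X(s)))      # empirical K_X(t, s)
print(0.5 * np.cos(t - s))       # compare with cos(t - s)/2
```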
27. Let {Xn} be a random process with mean m and autocorrelation function RX(n, k), and let {Wn} be an iid random process with zero mean and variance $\sigma_W^2$. Assume that the two processes are independent of each other; that is, any collection of the Xi is independent of any collection of the Wi. Form a new random process Yn = Xn + Wn. Note: This is a common model for a communication system or measurement system with {Xn} a “signal” process or “source,” {Wn} a “noise” process, and {Yn} the “received” process; see problem 3.30 for example.
(a) Find the mean EYn and covariance KY(t, s) in terms of the given parameters.
(b) Find the cross-correlation function defined by
$$R_{XY}(k, j) = E[X_kY_j] .$$
(c) As in exercise 4.23, find the minimum mean squared error estimate of $X_n$ of the form
$$\hat{X}(Y_n) = aY_n + b .$$
The resulting estimate is called a filtered value of $X_n$.
(d) Extend to a linear filtered estimate that uses $Y_n$ and $Y_{n-1}$.
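(For a concrete feel, a sketch under assumed toy parameters, not the requested derivation: a least-squares fit over simulated data approximates the best affine estimate. Assuming NumPy:)

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
m, var_x, var_w = 1.0, 4.0, 1.0                    # hypothetical parameters
Xn = m + np.sqrt(var_x) * rng.standard_normal(n)   # toy "signal" samples
Yn = Xn + np.sqrt(var_w) * rng.standard_normal(n)  # received Y_n = X_n + W_n

# least-squares fit of X_n ~ a*Y_n + b approximates the MMSE affine estimate
A = np.column_stack([Yn, np.ones(n)])
(a, b), *_ = np.linalg.lstsq(A, Xn, rcond=None)
print(a, b)
# compare with the Wiener-style answer a = var_x/(var_x+var_w), b = m(1-a)
a_star = var_x / (var_x + var_w)
print(a_star, m * (1 - a_star))
```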
28. Suppose that there are two independent data sources {Wi(n), i = 1, 2}. Each data source is modeled as a Bernoulli random process with parameter 1/2. The two sources are encoded for transmission as follows: First, three random processes {Yi(n); i = 1, 2, 3} are formed, where Y1 = W1, Y2 = W2, Y3 = W1 + W2, and where the last sum is taken modulo 2 and is formed to provide redundancy for noise protection in transmission. These are time-multiplexed to form a random process {X(3n + i) = Yi(n)}. Show that {X(n)} has identically distributed components and is pairwise independent but is not iid.
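(A quick empirical illustration, not a proof, assuming NumPy: every pairwise joint pmf factors into the product of uniform marginals, yet the triple is constrained by the parity relation.)

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
W1 = rng.integers(0, 2, n)          # Bernoulli(1/2) source 1
W2 = rng.integers(0, 2, n)          # Bernoulli(1/2) source 2
Y3 = (W1 + W2) % 2                  # parity bit added for redundancy

# each pair looks independent: empirical P(Y1=a, Y3=b) ~ 1/4 for all a, b
for a in (0, 1):
    for b in (0, 1):
        print(a, b, np.mean((W1 == a) & (Y3 == b)))
# but the triple is not iid: P(Y1=1, Y2=1, Y3=1) = 0, not 1/8
print(np.mean((W1 == 1) & (W2 == 1) & (Y3 == 1)))
```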
29. Let $\{U_n;\, n = 0, 1, \ldots\}$ be an iid random process with marginal pdf $f_{U_n} = f_U$, the uniform pdf of Problem A.1. In other words, the joint pdfs can be written as
$$f_{U^n}(u^n) = f_{U_0,U_1,\ldots,U_{n-1}}(u_0, u_1, \ldots, u_{n-1}) = \prod_{i=0}^{n-1} f_U(u_i) .$$
Find the mean mn = E[Un] and covariance function KU (k, j) = E[(Uk − mk)(Uj − mj)] for the process and verify that the weak law of large numbers holds for this process.
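(An optional numerical companion, assuming NumPy and taking $f_U$ to be uniform on [0, 1) for concreteness, since Problem A.1 is not reproduced here: the sample average visibly concentrates around the mean as n grows.)

```python
import numpy as np

rng = np.random.default_rng(5)
for n in (10, 100, 10_000):
    U = rng.uniform(0, 1, size=(20_000, n))   # 20,000 length-n realizations
    Sn = U.mean(axis=1)                       # sample averages
    # fraction of realizations with |S_n - 1/2| > 0.05 should shrink with n
    print(n, np.mean(np.abs(Sn - 0.5) > 0.05))
```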
33. Suppose that X is a random variable with mean m and variance σ2. Let gk be a deterministic periodic pulse train such that gk is 1 whenever k is a multiple of a fixed positive integer N and gk is 0 for all other k. Let U be a random variable that is independent of X such that pU(u) = 1/N for u = 0, 1, . . . , N − 1. Define the random process Yn by
$$Y_n = X g_{U+n} ;$$
that is, Yn looks like a periodic pulse train with a randomly selected amplitude and a randomly selected phase. Find the mean and covariance functions of the Y process. Find a random variable Y such that
$$\lim_{n\to\infty} \frac{1}{n}\sum_{i=0}^{n-1} Y_i = Y$$
in the sense of convergence with probability one. (This is an example of a process that is simple enough for the limit to be evaluated explicitly.) Under what conditions on the distribution of X does the limit equal EY0 (and hence the conclusion of the weak law of large numbers holds for this process with memory)?
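(A simulation sketch, assuming NumPy: for one realization, the time average of $g_{U+n}$ is exactly the fraction of ones in the pulse train, which suggests what the limiting random variable should look like.)

```python
import numpy as np

rng = np.random.default_rng(6)
N = 5                                    # pulse period (hypothetical choice)
X = rng.normal(2.0, 1.0)                 # one random amplitude
U = rng.integers(0, N)                   # one random phase
n = 100_000

k = np.arange(n)
g = ((U + k) % N == 0).astype(float)     # g_{U+n}: 1 at multiples of N
Yn = X * g
print(Yn.mean(), X / N)                  # time average vs. X/N, this realization
```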
34. Let {Xn} be an iid zero-mean Gaussian random process with autocorrelation function RX(0) = σ2. Let {Un} be an iid random process with Pr(Un = 1) = Pr(Un = −1) = 1/2. Assume that the two processes are mutually independent of each other. Define a new random process {Yn} by
Yn = UnXn .
(a) Find the autocorrelation function RY(k, j).
(b) Find the characteristic function $M_{Y_n}(ju)$.
(c) Is {Yn} an iid process?
(d) Does the sample average
$$S_n = n^{-1}\sum_{i=0}^{n-1} Y_i$$
converge in mean square? If so, to what?
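(An empirical sketch, assuming NumPy: flipping the sign of a symmetric Gaussian leaves its distribution unchanged, a useful hint for parts (b) through (d).)

```python
import numpy as np

rng = np.random.default_rng(7)
n, sigma = 500_000, 1.5                     # hypothetical sigma
Xn = sigma * rng.standard_normal(n)
Un = rng.choice([-1.0, 1.0], size=n)        # random signs
Yn = Un * Xn

print(Yn.mean(), Yn.var(), sigma**2)        # first two moments match X_n
for m in (100, 10_000, n):
    print(m, Yn[:m].mean())                 # sample averages shrink toward 0
```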
35. Assume that {Xn} is an iid zero-mean Gaussian random process with RX(0) = σ2, that {Un} is an iid binary random process with Pr(Un = 1) = 1 − ε and Pr(Un = 0) = ε (in other words, {Un} is a Bernoulli process with parameter 1 − ε), and that the processes {Xn} and {Un} are mutually independent of each other. Define a new random process
Vn = XnUn .
(This is a model for the output of a communication channel that has the X process as an input but has “dropouts” — that is, occasionally sets an input symbol to zero.)
(a) Find the mean $EV_n$ and characteristic function $M_{V_n}(ju) = E[e^{juV_n}]$.
(b) Find the mean squared error $E[(X_n - V_n)^2]$.
(c) Find $\Pr(X_n = V_n)$.
(d) Find the covariance of $V_n$.
(e) Is the following true?
$$\left(\lim_{n\to\infty}\frac{1}{n}\sum_{i=0}^{n-1} X_i\right)\left(\lim_{n\to\infty}\frac{1}{n}\sum_{i=0}^{n-1} U_i\right) = \lim_{n\to\infty}\frac{1}{n}\sum_{i=0}^{n-1} V_i$$
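(A Monte Carlo sketch, assuming NumPy and hypothetical σ and ε, for checking answers to (a) through (c); it prints the empirical mean, mean squared error, and Pr(Xn = Vn).)

```python
import numpy as np

rng = np.random.default_rng(8)
n, sigma, eps = 500_000, 1.0, 0.1
Xn = sigma * rng.standard_normal(n)
Un = (rng.uniform(size=n) > eps).astype(float)   # 1 w.p. 1-eps, 0 w.p. eps
Vn = Xn * Un                                     # channel with dropouts

print(Vn.mean())                   # compare with your answer for E[V_n]
print(np.mean((Xn - Vn) ** 2))     # compare with your answer to (b)
print(np.mean(Xn == Vn))           # compare with your answer to (c)
```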
36. Show that convergence in distribution is implied by the other three forms of convergence.
37. Let {Xn} be a finite-alphabet iid random process with marginal pmf pX. The entropy of an iid random process is defined as
$$H(X) = -\sum_{x} p_X(x)\log p_X(x) = E(-\log p_X(X)) ,$$
where care must be taken to distinguish the use of the symbol X to mean the name of the random variable in H(X) and pX and its use as the random variable itself in the argument of the right-hand expression. If the logarithm is base two, then the units of entropy are called bits. Use the weak law of large numbers to show that
$$-\frac{1}{n}\sum_{i=0}^{n-1}\log p_X(X_i) \to H(X) \quad \text{as } n \to \infty$$
in the sense of convergence in probability. Show that this implies that
$$\lim_{n\to\infty} \Pr\left(\left|p_{X_0,\ldots,X_{n-1}}(X_0, \ldots, X_{n-1}) - 2^{-nH(X)}\right| > \epsilon\right) = 0$$
for any ε > 0. This result was first developed by Claude Shannon and is sometimes called the asymptotic equipartition property of information theory. It forms one of the fundamental results of the mathematical theory of communication. Roughly stated, with high probability an iid process will produce for large n an n-dimensional sample vector $x^n = (x_0, x_1, \ldots, x_{n-1})$ such that the nth-order probability mass function evaluated at $x^n$ is approximately $2^{-nH(X)}$; that is, the process produces long vectors that appear to have an approximately uniform distribution over some collection of possible vectors.
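(A small demonstration, assuming NumPy, with a Bernoulli(0.2) marginal chosen arbitrarily: the normalized log-probability of an observed sequence settles near H(X).)

```python
import numpy as np

rng = np.random.default_rng(9)
p = 0.2                                            # hypothetical Bernoulli parameter
H = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))   # entropy in bits

for n in (10, 1_000, 100_000):
    x = (rng.uniform(size=n) < p).astype(int)
    pmf = np.where(x == 1, p, 1 - p)               # p_X(X_i) for each sample
    print(n, -np.log2(pmf).sum() / n, H)           # sample entropy rate vs. H(X)
```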
38. Suppose that {Xn} is a discrete time iid random process with uniform marginal pdfs
$$f_{X_n}(\alpha) = \begin{cases} 1 & 0 \le \alpha < 1 \\ 0 & \text{otherwise.} \end{cases}$$
Does the sequence of random variables
$$Z_n = \prod_{i=1}^{n} X_i$$
converge in probability? If so, to what?
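(A quick simulation, assuming NumPy; taking logarithms turns the product into a sum of iid terms, which is the key idea for the proof.)

```python
import numpy as np

rng = np.random.default_rng(10)
for n in (10, 100, 1_000):
    X = rng.uniform(0, 1, size=(50_000, n))
    Zn = np.exp(np.log(X).sum(axis=1))         # product via sum of logs
    print(n, Zn.mean(), np.mean(Zn > 1e-3))    # mass escaping toward zero
```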
39. The conditional differential entropy of $X_{n-1}$ given $X^{n-1} = (X_0, X_1, \ldots, X_{n-2})$ is defined by
$$h(X_{n-1}\mid X^{n-1}) = -\int f_{X_0,X_1,\ldots,X_{n-1}}(x_0, x_1, \ldots, x_{n-1}) \times \log f_{X_{n-1}\mid X_0,\ldots,X_{n-2}}(x_{n-1}\mid x_0, \ldots, x_{n-2})\, dx_0\, dx_1 \cdots dx_{n-1} . \qquad (4.136)$$
Show that
$$h(X^n) = h(X_{n-1}\mid X^{n-1}) + h(X^{n-1}) . \qquad (4.137)$$
Now suppose that {Xn} is a stationary Gaussian random process with zero mean and covariance function K. Evaluate $h(X_{n-1}\mid X^{n-1})$.
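(For orientation, identity (4.137) is a chain rule; a sketch of the factorization step it rests on, using the definitions above:)

```latex
\begin{align*}
h(X^n) &= -\int f_{X^n}(x^n)\,\log f_{X^n}(x^n)\,dx^n\\
       &= -\int f_{X^n}(x^n)\,\log\bigl[f_{X_{n-1}\mid X^{n-1}}(x_{n-1}\mid x^{n-1})\,
          f_{X^{n-1}}(x^{n-1})\bigr]\,dx^n\\
       &= h(X_{n-1}\mid X^{n-1}) + h(X^{n-1}).
\end{align*}
```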
40. Let X ≥ 0 be an integer-valued random variable with E(X) < ∞.
(a) Prove that
$$E(X) = \sum_{k=1}^{\infty} P(X \ge k) .$$
(b) Based on (a), argue that
$$\lim_{N\to\infty} P(X \ge N) = 0 .$$

(c) Prove the stronger statement
$$P(X \ge N) \le \frac{E(X)}{N} .$$
Hint: Write an expression for the expectation E(X) and break up the sum into two parts, a portion where the summation dummy variable is larger than N and a portion where it is smaller. A simple lower bound for each part gives the desired result.
(d) Let X be a geometric random variable with parameter p, p ≠ 0. Calculate the quantity P(X ≥ N) and use this result to show directly that $\lim_{N\to\infty} P(X \ge N) = 0$.
(e) Based on the previous parts, show that
$$(1 - p)^{N-1} \le \frac{1}{pN}$$
for any 0 < p ≤ 1 and for any integer N.
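(A numeric sanity check of the bound in (e), assuming as in part (d) that the geometric variable takes values in {1, 2, . . . } so that P(X ≥ N) = (1 − p)^{N−1} and E(X) = 1/p:)

```python
for p in (0.1, 0.5, 0.9):
    for N in (1, 5, 50):
        tail = (1 - p) ** (N - 1)   # P(X >= N) for a geometric on {1, 2, ...}
        bound = 1 / (p * N)         # Markov bound E(X)/N with E(X) = 1/p
        print(p, N, tail, bound, tail <= bound)
```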
41. Suppose that $\{X_n\}$ is an iid random process with mean $E(X_n) = \bar{X}$ and variance $E[(X_n - \bar{X})^2] = \sigma_X^2$. A new process $\{Y_n\}$ is defined by the relation
$$Y_n = \sum_{k=0}^{\infty} r^k X_{n-k} ,$$
where |r| < 1. Find E(Yn) and the autocorrelation RY (k, j) and the covariance KY (k, j).
Define the sample average
$$S_n = \frac{1}{n}\sum_{i=0}^{n-1} Y_i .$$
Find the mean $E(S_n)$ and variance $\sigma_{S_n}^2$. Does $S_n \to 0$ in probability?
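(A simulation sketch, assuming NumPy; the infinite sum is truncated at a finite depth, which is reasonable because |r| < 1 makes the tail negligible. All parameter values are hypothetical.)

```python
import numpy as np

rng = np.random.default_rng(11)
r, xbar, sigma_x = 0.8, 0.0, 1.0         # hypothetical parameters
n, tail = 50_000, 200                    # tail: truncation depth of the sum

X = xbar + sigma_x * rng.standard_normal(n + tail)
weights = r ** np.arange(tail)
# Y_n = sum_k r^k X_{n-k}, approximated by a length-`tail` convolution
Y = np.convolve(X, weights, mode="valid")[:n]

print(Y.mean(), xbar / (1 - r))          # E[Y_n] should equal Xbar/(1 - r)
for m in (100, 10_000, n):
    print(m, Y[:m].mean())               # behavior of the sample average S_m
```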
42. Let {Un} be an iid Gaussian random process with mean 0 and variance σ2. Suppose that Z is a random variable having a uniform distribution on [0, 1]. Suppose Z represents the value of a measurement taken by a remote sensor and that we wish to guess the value of Z based on a noisy sequence of measurements Yn = Z + Un, n = 0, 1, 2, . . . ; that is, we observe only Yn and wish to estimate the underlying value of Z. To do this we form a sample average and define the estimate
$$\hat{Z}_n = \frac{1}{n}\sum_{i=0}^{n-1} Y_i .$$

(a) Find a simple upper bound to the probability
$$\Pr(|\hat{Z}_n - Z| > \epsilon)$$
that goes to zero as n → ∞. (This means that our estimator is asymptotically good.)
Suppose next that we have a two-dimensional random process {Un, Wn} (i.e., the output at each time is a random pair or a two-dimensional random variable) with the following properties: Each pair (Un, Wn) is independent of all past and future pairs (Uk, Wk), k ≠ n. Each pair (Un, Wn) has an identical joint cdf FU,W(u, w). For each n, $EU_n = EW_n = 0$, $E(U_n^2) = E(W_n^2) = \sigma^2$, and $E(U_nW_n) = \rho\sigma^2$. (The quantity ρ is called the correlation coefficient.) Instead of just observing
a noisy sequence Yn = Z + Un, we also observe a separate noisy measurement sequence Xn = Z + Wn (the same Z, but different noises). Suppose further that we try to improve our estimate of Z by using both of these measurements to form an estimate
$$\tilde{Z}_n = a\,\frac{1}{n}\sum_{i=0}^{n-1} Y_i + (1 - a)\,\frac{1}{n}\sum_{i=0}^{n-1} X_i$$
for some a in [0, 1].
(b) Show that |ρ| ≤ 1. Find a simple upper bound to the probability
$$\Pr(|\tilde{Z}_n - Z| > \epsilon)$$
that goes to zero as n → ∞. What value of a gives the smallest upper bound in part (b) and what is the resulting bound? (Note as a check that the bound should be no worse than part (a) since the estimator of part (a) is a special case of that of part (b).) In the special case where ρ = −1, what is the best a and what is the resulting bound?
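(For intuition about the optimal a, a sketch assuming NumPy and hypothetical σ², ρ: the per-sample variance of the combined noise aUn + (1 − a)Wn is $a^2\sigma^2 + (1-a)^2\sigma^2 + 2a(1-a)\rho\sigma^2$, which a short grid search can minimize numerically.)

```python
import numpy as np

sigma2 = 1.0                            # hypothetical noise variance
for rho in (0.5, 0.0, -1.0):
    a = np.linspace(0, 1, 1001)         # grid over the mixing weight
    v = (a**2 + (1 - a)**2 + 2 * a * (1 - a) * rho) * sigma2
    best = a[np.argmin(v)]
    # for rho = -1 the noise cancels entirely at a = 1/2
    print(rho, best, v.min())
```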
43. Suppose that {Xn} are iid random variables described by a common marginal distribution F. Suppose that the random variables
$$S_n = \frac{1}{n}\sum_{i=0}^{n-1} X_i$$
also have the distribution F for all positive integers n. Find the form of the distribution F. (This is an example of what is called a stable distribution.) Suppose that the 1/n in the definition of Sn is replaced by $1/\sqrt{n}$. What must F then be?
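(An exploratory sketch assuming NumPy, not a derivation: one natural family to test is the Cauchy, for which the sample average has the same distribution as a single sample, in sharp contrast to the Gaussian case.)

```python
import numpy as np

rng = np.random.default_rng(12)
n, trials = 100, 200_000
X = rng.standard_cauchy((trials, n))
Sn = X.mean(axis=1)

# compare a few empirical quantiles of S_n with those of a single X_0
qs = (0.1, 0.25, 0.5, 0.75, 0.9)
print(np.quantile(Sn, qs))
print(np.quantile(X[:, 0], qs))   # nearly identical for this distribution
```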
44. Consider the following nonlinear modulation scheme: Define
$$W(t) = e^{j(2\pi f_0 t + cX(t) + \Theta)} ,$$
where {X(t)} is a weakly stationary Gaussian random process with autocorrelation function $R_X(\tau)$, f0 is a fixed frequency, Θ is a uniform random variable on [0, 2π], Θ is independent of all of the X(t), and c is a modulation constant. (This is a mathematical model for phase modulation.)
Define the expectation of a complex random variable in the natural way; that is, if $Z = \Re(Z) + j\Im(Z)$, then $E(Z) = E[\Re(Z)] + j\,E[\Im(Z)]$. Define the autocorrelation of a complex-valued random process W(t)
by
$$R_W(t, s) = E(W(t)W^*(s)) ,$$
where $W^*(s)$ denotes the complex conjugate of W(s).
Find the mean E(W(t)) and the autocorrelation function $R_W(t, s) = E[W(t)W^*(s)]$.
Hint: The autocorrelation is admittedly a trick question (but a very useful trick). Keep characteristic functions in mind when pondering the evaluation of the autocorrelation function.
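(Following the hint, one workable route is sketched below; it uses the fact that X(t) − X(s) is zero-mean Gaussian with variance $2(R_X(0) - R_X(t-s))$, whose characteristic function is known in closed form.)

```latex
\begin{align*}
R_W(t,s) &= E\!\left[e^{j(2\pi f_0 t + cX(t) + \Theta)}\, e^{-j(2\pi f_0 s + cX(s) + \Theta)}\right]\\
         &= e^{j2\pi f_0 (t-s)}\; E\!\left[e^{jc\,(X(t)-X(s))}\right]\\
         &= e^{j2\pi f_0 (t-s)}\; e^{-c^2\,(R_X(0)-R_X(t-s))} .
\end{align*}
```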
45. Suppose that {Xn; n = 0, 1, · · · } is a discrete time iid random process with pmf
pXn (k) = 1/2; k = 0, 1.
Two other random processes are defined in terms of the X process:
$$Y_n = \sum_{i=0}^{n} X_i ; \quad n = 0, 1, \cdots$$
$$W_n = (-1)^{Y_n} ; \quad n = 0, 1, \cdots$$
and
$$V_n = X_n - X_{n-1} ; \quad n = 1, 2, \ldots .$$
(a) Find the covariance functions for the X and Y processes.
(b) Find the mean and variance of the random variable Wn. Find the covariance function of the process Wn.
(c) Find the characteristic function of the random variable Vn.

(d) Which of the above four processes are weakly stationary? Which are not?
(e) Evaluate the following limits:
i. l.i.m.$_{n\to\infty}\; Y_n/(n+1)$.
ii. l.i.m.$_{n\to\infty}\; Y_n/n^2$.
iii. l.i.m.$_{n\to\infty}\; \frac{1}{n}\sum_{l=1}^{n} V_l$.
iv. For the showoffs: Does the last limit above converge with probability one? (Only elementary arguments are needed.)
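(A simulation sketch, assuming NumPy, of the three limits along a single long realization; it also hints at why the answers come out as they do, since Yn grows like n/2 and the Vl sum telescopes.)

```python
import numpy as np

rng = np.random.default_rng(13)
n = 1_000_000
X = rng.integers(0, 2, n + 1)      # X_0, ..., X_n, each Bernoulli(1/2)
Y = np.cumsum(X)                   # Y_n = X_0 + ... + X_n

print(Y[n] / (n + 1))              # i.   settles near 1/2
print(Y[n] / n**2)                 # ii.  settles near 0
V_sum = X[n] - X[0]                # sum_{l=1}^{n} V_l telescopes
print(V_sum / n)                   # iii. settles near 0
```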
46. Suppose that {Xn} is a discrete time iid random process with uniform marginal pdfs
$$f_{X_n}(\alpha) = \begin{cases} 1 & 0 \le \alpha < 1 \\ 0 & \text{otherwise.} \end{cases}$$
Define the following random variables:
• U = $X_0^2$
• V = max(X1, X2, X3, X4)
• $W = \begin{cases} 1 & \text{if } X_1 \ge 2X_2 \\ 0 & \text{otherwise} \end{cases}$
• For each integer n
Yn = Xn + Xn−1.
Note that this defines a new random process {Yn}.
(a) Find the expected values of the random variables U, V, and W.
(b) What are the mean E(Xn) and covariance function KX(k, j) of {Xn}?
(c) What are the mean E(Yn) and covariance function KY(k, j) of {Yn}?
(d) Define the sample average
$$S_n = \frac{1}{n}\sum_{k=1}^{n} Y_k .$$
Find the mean $E(S_n)$ and variance $\sigma_{S_n}^2$ of $S_n$. Using only these results (and no results not yet covered in class), find l.i.m.$_{n\to\infty} S_n$.
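(A Monte Carlo check of part (a), assuming NumPy; the printed values should be close to the exact answers obtained by integration.)

```python
import numpy as np

rng = np.random.default_rng(14)
n = 1_000_000
X = rng.uniform(0, 1, size=(n, 5))       # columns play the roles of X_0..X_4

U = X[:, 0] ** 2                         # U = X_0^2
V = X[:, 1:5].max(axis=1)                # V = max(X_1, X_2, X_3, X_4)
W = (X[:, 1] >= 2 * X[:, 2]).astype(float)
print(U.mean(), V.mean(), W.mean())      # compare with your integrals
```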