
Since X1 + … + Xn is binomial with parameters n and x, we have

$$E\left[f\left(\frac{X_1 + \cdots + X_n}{n}\right)\right] = \sum_{k=0}^{n} f(k/n)\binom{n}{k} x^k (1 - x)^{n-k}$$

The result now follows from Exercise 4.
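As a quick numeric illustration (not from the manual), the right-hand side is the Bernstein polynomial of f, which approaches f(x) as n grows; the test function and the values of n and x below are arbitrary choices.

    # Numeric check: E[f((X1+...+Xn)/n)] for Bernoulli(x) summands equals the
    # Bernstein sum above, which tends to f(x) as n grows.
    from math import comb

    def bernstein_expectation(f, n, x):
        return sum(f(k / n) * comb(n, k) * x**k * (1 - x)**(n - k)
                   for k in range(n + 1))

    f = lambda t: t * t                              # arbitrary continuous test function
    for n in (10, 100, 1000):
        print(n, bernstein_expectation(f, n, 0.3))   # approaches f(0.3) = 0.09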

 

 

 


6.
$$E[X] = \sum_{i=1}^{k} i\,P\{X = i\} + \sum_{i=k+1}^{\infty} i\,P\{X = i\} \geq \sum_{i=1}^{k} i\,P\{X = k\} = P\{X = k\}\,\frac{k(k+1)}{2} \geq \frac{k^2}{2}\,P\{X = k\}$$

7. Take logs and apply the central limit theorem.
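A quick simulation sketch of this idea (my own illustration; the Uniform(0.5, 1.5) factors and the sample sizes are arbitrary):

    # The log of a product of i.i.d. positive factors is a sum of i.i.d. terms, so by
    # the central limit theorem it is approximately normal; about 68% of the simulated
    # values should fall within one standard deviation of their mean.
    import random, math

    random.seed(0)
    n, trials = 100, 20000
    logs = [sum(math.log(random.uniform(0.5, 1.5)) for _ in range(n))
            for _ in range(trials)]
    m = sum(logs) / trials
    s = math.sqrt(sum((z - m) ** 2 for z in logs) / trials)
    print(sum(abs(z - m) < s for z in logs) / trials)   # close to 0.68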

8. It is the distribution of the sum of t independent exponentials, each having rate λ.

9. 1/2

10. Use the Chernoff bound: $e^{-ti}M(t) = e^{\lambda(e^t - 1) - ti}$ will obtain its minimal value when t is chosen to satisfy $\lambda e^t = i$, and this value of t is negative provided i < λ.

Hence, the Chernoff bound gives

$$P\{X \leq i\} \leq e^{i - \lambda}(\lambda/i)^i$$
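A small numeric check of this bound (my own addition; λ and i below are arbitrary with i < λ):

    # For X ~ Poisson(lam) and i < lam, the bound e^(i - lam) * (lam/i)^i should
    # dominate the exact left-tail probability P{X <= i}.
    from math import exp, factorial

    lam, i = 10.0, 4
    exact = sum(exp(-lam) * lam**k / factorial(k) for k in range(i + 1))
    bound = exp(i - lam) * (lam / i) ** i
    print(exact, bound)   # prints roughly 0.029 and 0.097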

11. $e^{-ti}M(t) = (pe^t + q)^n e^{-ti}$ and differentiation shows that the value of t that minimizes it is such that

$$npe^t = i(pe^t + q), \qquad \text{or} \qquad e^t = \frac{iq}{(n - i)p}$$

Using this value of t, the Chernoff bound gives that

$$P\{X \geq i\} \leq \left(\frac{iq}{n - i} + q\right)^n \left(\frac{(n - i)p}{iq}\right)^i = \frac{(nq)^n (n - i)^i p^i}{i^i q^i (n - i)^n}$$
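A similar numeric check (my own addition; n, p, and i below are arbitrary with i > np):

    # For X ~ Binomial(n, p) and i > n*p, the Chernoff bound above should dominate
    # the exact upper-tail probability P{X >= i}.
    from math import comb

    n, p, i = 50, 0.3, 25
    q = 1 - p
    exact = sum(comb(n, k) * p**k * q**(n - k) for k in range(i, n + 1))
    bound = (n * q) ** n * (n - i) ** i * p**i / (i**i * q**i * (n - i) ** n)
    print(exact, bound)   # the bound should exceed the exact tail probability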

12. $1 = E[e^{\theta X}] \geq e^{\theta E[X]}$ by Jensen's inequality. Hence, $\theta E[X] \leq 0$ and thus, since E[X] < 0, θ > 0.


Chapter 9

Problems and Theoretical Exercises

1. (a) P{2 arrivals in (0, s) | 2 arrivals in (0, 1)}
= P{2 in (0, s), 0 in (s, 1)}/(e^{−λ}λ^2/2)
= [e^{−λs}(λs)^2/2][e^{−(1−s)λ}]/(e^{−λ}λ^2/2) = s^2 = 1/9 when s = 1/3

(b) 1 − P{both in last 40 minutes} = 1 − (2/3)^2 = 5/9
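A simulation sketch of part (a) (my own addition; the answer does not depend on the rate, which is chosen arbitrarily below):

    # Given exactly 2 arrivals of a Poisson process in (0, 1), the arrival times are
    # i.i.d. uniform, so both fall in (0, 1/3) with probability (1/3)^2 = 1/9.
    import random

    random.seed(1)
    lam, conditioned, hits = 2.0, 0, 0
    for _ in range(200000):
        t, times = 0.0, []
        while True:
            t += random.expovariate(lam)      # exponential interarrival times
            if t > 1.0:
                break
            times.append(t)
        if len(times) == 2:
            conditioned += 1
            if max(times) < 1 / 3:
                hits += 1
    print(hits / conditioned)                 # close to 1/9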

2. e^{−3s/60}

3. e^{−3s/60} + (s/20)e^{−3s/60}

8. The equations for the limiting probabilities are:

c = .7c + .4s + .2g,   s = .2c + .3s + .4g,   g = .1c + .3s + .4g,   c + s + g = 1

and the solution is: c = 30/59, s = 16/59, g = 13/59. Hence, Buffy is cheerful 3000/59 ≈ 50.8 percent of the time.
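A quick check of this system (my own addition), solving π = πP together with the normalization constraint; the rows of P are read off from the equations above:

    # Solve the stationary-distribution equations for the three-state chain above.
    import numpy as np

    P = np.array([[.7, .2, .1],     # transitions out of state c
                  [.4, .3, .3],     # transitions out of state s
                  [.2, .4, .4]])    # transitions out of state g
    A = np.vstack([P.T - np.eye(3), np.ones(3)])
    b = np.array([0., 0., 0., 1.])
    pi = np.linalg.lstsq(A, b, rcond=None)[0]
    print(pi, [30/59, 16/59, 13/59])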

9. The Markov chain requires 4 states:

0 = RR = Rain today and rain yesterday
1 = RD = Dry today, rain yesterday
2 = DR = Rain today, dry yesterday
3 = DD = Dry today and dry yesterday

with transition probability matrix

$$P = \begin{pmatrix} .8 & .2 & 0 & 0 \\ 0 & 0 & .3 & .7 \\ .4 & .6 & 0 & 0 \\ 0 & 0 & .2 & .8 \end{pmatrix}$$


The equations for the limiting probabilities are:

$$\pi_0 = .8\pi_0 + .4\pi_2, \quad \pi_1 = .2\pi_0 + .6\pi_2, \quad \pi_2 = .3\pi_1 + .2\pi_3, \quad \pi_3 = .7\pi_1 + .8\pi_3$$

$$\pi_0 + \pi_1 + \pi_2 + \pi_3 = 1$$

which gives

$$\pi_0 = 4/15, \quad \pi_1 = \pi_2 = 2/15, \quad \pi_3 = 7/15.$$

Since it rains today when the state is either 0 or 2, the probability is 2/5.
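As a further check (my own addition), every row of a high power of the transition matrix from Problem 9 approaches these limiting probabilities:

    # Each row of P^100 converges to (4/15, 2/15, 2/15, 7/15) for the chain above.
    import numpy as np

    P = np.array([[.8, .2, 0., 0.],
                  [0., 0., .3, .7],
                  [.4, .6, 0., 0.],
                  [0., 0., .2, .8]])
    print(np.linalg.matrix_power(P, 100)[0])
    print(np.array([4, 2, 2, 7]) / 15)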

10. Let the state be the number of pairs of shoes at the door he leaves from in the morning. Suppose the present state is i, where i > 0. Now after his return it is equally likely that one door will have i and the other 5 − i pairs as it is that one will have i − 1 and the other 6 − i. Hence, since he is equally likely to choose either door when he leaves tomorrow, it follows that

$$P_{i,i} = P_{i,5-i} = P_{i,i-1} = P_{i,6-i} = 1/4$$

provided all the states i, 5 − i, i − 1, 6 − i are distinct. If they are not, then the probabilities are added. From this it is easy to see that the transition matrix $P_{ij}$, i, j = 0, 1, …, 5 is as follows:

 

$$P = \begin{pmatrix}
1/2 & 0 & 0 & 0 & 0 & 1/2 \\
1/4 & 1/4 & 0 & 0 & 1/4 & 1/4 \\
0 & 1/4 & 1/4 & 1/4 & 1/4 & 0 \\
0 & 0 & 1/2 & 1/2 & 0 & 0 \\
0 & 1/4 & 1/4 & 1/4 & 1/4 & 0 \\
1/4 & 1/4 & 0 & 0 & 1/4 & 1/4
\end{pmatrix}$$

Since this chain is doubly stochastic (the column sums as well as the row sums all equal one), it follows that $\pi_i$ = 1/6, i = 0, …, 5, and thus he runs barefooted one-sixth of the time.
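A short verification (my own addition) that the matrix is doubly stochastic and that the uniform vector is stationary:

    # Row sums and column sums are all 1, so the uniform distribution is stationary.
    import numpy as np

    P = np.array([[1/2, 0,   0,   0,   0,   1/2],
                  [1/4, 1/4, 0,   0,   1/4, 1/4],
                  [0,   1/4, 1/4, 1/4, 1/4, 0  ],
                  [0,   0,   1/2, 1/2, 0,   0  ],
                  [0,   1/4, 1/4, 1/4, 1/4, 0  ],
                  [1/4, 1/4, 0,   0,   1/4, 1/4]])
    print(P.sum(axis=1), P.sum(axis=0))       # both vectors of ones
    pi = np.full(6, 1/6)
    print(np.allclose(pi @ P, pi))            # True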

11. (b) 1/2

(c) Intuitively, they should be independent.

(d) From (b) and (c), the (limiting) number of molecules in urn 1 should have a binomial distribution with parameters (M, 1/2).


Chapter 10

1. (a) After stage k the algorithm has generated a random permutation of 1, 2, …, k. It then puts element k + 1 in position k + 1; randomly chooses one of the positions 1, …, k + 1 and interchanges the element in that position with element k + 1.

(b) The first equality in the hint follows since the permutation given will be the permutation after insertion of element k if the previous permutation is $i_1, \ldots, i_{j-1}, i, i_j, \ldots, i_{k-2}$ and the random choice of one of the k positions of this permutation results in the choice of position j.
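A minimal sketch of the algorithm described in part (a) (the implementation details are my own):

    # Build a random permutation of 1..n by placing element k+1 in position k+1
    # and swapping it with a uniformly chosen position among 1..k+1.
    import random

    def random_permutation(n):
        perm = []
        for k in range(n):
            perm.append(k + 1)                    # element k+1 goes in position k+1
            j = random.randrange(k + 1)           # uniform over positions 1..k+1
            perm[j], perm[k] = perm[k], perm[j]   # interchange
        return perm

    print(random_permutation(10))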

2. Integrating the density function yields that the distribution function is

$$F(x) = \begin{cases} e^{2x}/2, & x \leq 0 \\ 1 - e^{-2x}/2, & x > 0 \end{cases}$$

which yields that the inverse function is given by

$$F^{-1}(u) = \begin{cases} \log(2u)/2, & u < 1/2 \\ -\log(2[1 - u])/2, & u > 1/2 \end{cases}$$

Hence, we can simulate X from F by simulating a random number U and setting X = F^{−1}(U).
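A sketch of this inverse-transform step for the density above, f(x) = e^{−2|x|} (code is my own):

    # Inverse-transform simulation for the double-exponential distribution above.
    import random, math

    def simulate_X():
        u = random.random()
        if u < 0.5:
            return math.log(2 * u) / 2
        return -math.log(2 * (1 - u)) / 2

    xs = [simulate_X() for _ in range(100000)]
    print(sum(xs) / len(xs))                  # the density is symmetric, so mean ~ 0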

3. The distribution function is given by

$$F(x) = \begin{cases} x^2/4 - x + 1, & 2 \leq x \leq 3 \\ x - x^2/12 - 2, & 3 \leq x \leq 6 \end{cases}$$

Hence, for u ≤ 1/4, F^{−1}(u) is the solution of

$$x^2/4 - x + 1 = u$$

that falls in the region 2 ≤ x ≤ 3. Similarly, for u ≥ 1/4, F^{−1}(u) is the solution of

$$x - x^2/12 - 2 = u$$

that falls in the region 3 ≤ x ≤ 6. We can now generate X from F by generating a random number U and setting X = F^{−1}(U).
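Solving the two quadratics explicitly gives F^{−1}(u) = 2 + 2√u for u ≤ 1/4 and F^{−1}(u) = 6 − 2√(3(1 − u)) for u ≥ 1/4 (my own algebra); a short sketch:

    # Inverse-transform simulation for the piecewise distribution function above.
    import random, math

    def simulate_X():
        u = random.random()
        if u <= 0.25:
            return 2 + 2 * math.sqrt(u)            # root of x^2/4 - x + 1 = u in [2, 3]
        return 6 - 2 * math.sqrt(3 * (1 - u))      # root of x - x^2/12 - 2 = u in [3, 6]

    xs = [simulate_X() for _ in range(100000)]
    print(min(xs), max(xs))                        # all values lie in [2, 6]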

4. Generate a random number U and then set X = F^{−1}(U). If U ≤ 1/2 then X = 6U − 3, whereas if U ≥ 1/2 then X is obtained by solving the quadratic 1/2 + X^2/32 = U in the region 0 ≤ X ≤ 4.


5. The inverse equation F^{−1}(U) = X is equivalent to

$$1 - e^{-\alpha X^{\beta}} = U$$

or

$$X = \{-\log(1 - U)/\alpha\}^{1/\beta}$$

Since 1 − U has the same distribution as U, we can generate from F by generating a random number U and setting X = {−log(U)/α}^{1/β}.
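A sketch of the resulting step (my own code; the parameter values are arbitrary):

    # Inverse-transform simulation of F(x) = 1 - exp(-alpha * x**beta).
    import random, math

    def simulate_X(alpha, beta):
        return (-math.log(random.random()) / alpha) ** (1 / beta)

    alpha, beta, x0 = 2.0, 1.5, 0.7
    xs = [simulate_X(alpha, beta) for _ in range(100000)]
    print(sum(x <= x0 for x in xs) / len(xs))      # empirical F(x0)
    print(1 - math.exp(-alpha * x0 ** beta))       # exact F(x0)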

6. If λ(t) = ct^n then the distribution function is given by

$$1 - F(t) = \exp\{-kt^{n+1}\}, \quad t \geq 0, \quad \text{where } k = c/(n + 1)$$

Hence, using the inverse transform method we can generate a random number U and then set X such that

$$\exp\{-kX^{n+1}\} = 1 - U \qquad \text{or} \qquad X = \{-\log(1 - U)/k\}^{1/(n+1)}$$

Again U can be used for 1 − U.

7. (a) The inverse transform method shows that U^{1/n} works.

(b) $P\{\max U_i \leq x\} = P\{U_1 \leq x, \ldots, U_n \leq x\} = \prod_{i=1}^{n} P\{U_i \leq x\}$ by independence $= x^n$

(c) Simulate n random numbers and use the maximum value obtained.
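Parts (a) and (c) simulate the same distribution; a quick comparison of the two methods (my own code, with an arbitrary n):

    # Compare U**(1/n) with the maximum of n uniforms; both have mean n/(n+1).
    import random

    n, trials = 5, 100000
    a = sum(random.random() ** (1 / n) for _ in range(trials)) / trials
    c = sum(max(random.random() for _ in range(n)) for _ in range(trials)) / trials
    print(a, c, n / (n + 1))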

8. (a) If X_i has distribution F_i, i = 1, …, n, then, assuming independence, F is the distribution of Max X_i. Hence, we can simulate from F by simulating X_i, i = 1, …, n and setting X = Max X_i.

(b) Use the method of (a), replacing Max by Min throughout.

9. (a) Simulate X_1 from F_1 and X_2 from F_2. Now generate a random number U and set X equal to X_1 if U < p and equal to X_2 if U > p.

(b) Note that

$$F(x) = \tfrac{1}{3}F_1(x) + \tfrac{2}{3}F_2(x)$$

where

$$F_1(x) = 1 - e^{-3x},\ x > 0, \qquad F_2(x) = x,\ 0 < x < 1$$


Hence, using (a), let U_1, U_2, U_3 be random numbers and set

$$X = \begin{cases} -\log(U_1)/3 & \text{if } U_3 < 1/3 \\ U_2 & \text{if } U_3 > 1/3 \end{cases}$$

where the above uses that −log(U_1)/3 is exponential with rate 3.
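A sketch of this composition step (my own code):

    # With probability 1/3 return an exponential with rate 3, otherwise a uniform(0, 1).
    import random, math

    def simulate_X():
        if random.random() < 1 / 3:
            return -math.log(random.random()) / 3
        return random.random()

    xs = [simulate_X() for _ in range(100000)]
    print(sum(xs) / len(xs))          # mean should be near (1/3)(1/3) + (2/3)(1/2) = 4/9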

10. With g(x) = λe^{−λx},

$$\frac{f(x)}{g(x)} = \frac{2e^{-x^2/2}}{\lambda(2\pi)^{1/2}e^{-\lambda x}} = \frac{2\exp\{-[(x - \lambda)^2 - \lambda^2]/2\}}{\lambda(2\pi)^{1/2}} = \frac{2e^{\lambda^2/2}\exp\{-(x - \lambda)^2/2\}}{\lambda(2\pi)^{1/2}}$$

Hence, c = 2e^{λ²/2}/[λ(2π)^{1/2}], and simple calculus shows that this is minimized when λ = 1.
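With λ = 1 the acceptance probability is f(Y)/(cg(Y)) = exp{−(Y − 1)²/2}; a sketch of the resulting rejection algorithm (my own code):

    # Rejection sampling of the half-normal density f(x) = sqrt(2/pi) exp(-x^2/2), x > 0,
    # from an exponential(1) proposal, accepting with probability exp(-(y - 1)^2 / 2).
    import random, math

    def simulate_abs_Z():
        while True:
            y = -math.log(random.random())            # exponential with rate 1
            if random.random() <= math.exp(-(y - 1) ** 2 / 2):
                return y

    xs = [simulate_abs_Z() for _ in range(100000)]
    print(sum(xs) / len(xs), math.sqrt(2 / math.pi))  # E|Z| = sqrt(2/pi)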

 

11. Calculus yields that the maximum value of f(x)/g(x) = 60x^3(1 − x)^2 is attained when x = 3/5 and is thus equal to 1296/625. Hence, generate random numbers U_1 and U_2 and set X = U_1 if U_2 ≤ 3125U_1^3(1 − U_1)^2/108. If not, repeat.
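A sketch of this rejection scheme (my own code):

    # Rejection sampling for f(x) = 60 x^3 (1 - x)^2 with a uniform(0, 1) proposal;
    # accept U1 when U2 <= f(U1)/c with c = 1296/625.
    import random

    def simulate_X():
        while True:
            u1, u2 = random.random(), random.random()
            if u2 <= 3125 * u1**3 * (1 - u1) ** 2 / 108:
                return u1

    xs = [simulate_X() for _ in range(100000)]
    print(sum(xs) / len(xs))          # the mean of this beta(4, 3) density is 4/7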

12. Generate random numbers U_1, …, U_n, and approximate the integral by [k(U_1) + … + k(U_n)]/n. This works by the law of large numbers since E[k(U)] = $\int_0^1 k(x)\,dx$.
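A minimal sketch (my own code; the integrand is an arbitrary example whose integral over (0, 1) equals π/4):

    # Approximate the integral of k over (0, 1) by the average of k at uniform points.
    import random, math

    def monte_carlo(k, n):
        return sum(k(random.random()) for _ in range(n)) / n

    k = lambda x: math.sqrt(1 - x * x)    # integral over (0, 1) is pi/4
    print(monte_carlo(k, 200000), math.pi / 4)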

16. E[g(X)/f(X)] = $\int [g(x)/f(x)]f(x)\,dx = \int g(x)\,dx$


Corrections to Ross, A FIRST COURSE IN PROBABILITY, seventh ed.

p. 79, line 2: $P(\bigcap_{j=1}^{n} R_j)$ should be $P(\bigcup_{j=1}^{n} R_j)$

p. 95, centered eq. on line 12: $P_{n,m-1,m}$ should be $P_{n,m-1}$

p. 288, l. 2: change “when j > r.” to “when j < 0.”

p. 309, first line of Example 8b: change "let Yi denote the selection " to "let Yi denote the selection "

p. 373, line -8: on the centered equation following "the preceding equation yields" add a right paren at the very end. That is,

$$(1 - (1 - p_i)^n(1 - (1 - p_i)^n))$$

p. 416, line 1: change "Let X1, . . . , Xn be independent" to "Let X1, . . . be independent"

p. 509, lines 6 and 7: 73. should be 83. and 74. should be 84.

p. 509, Solution to Problem 68 of Chapter 4: change $(1 - e^{-5})^{80}$ to $(1 - e^{-5})^{10}$
