
- 2. The classical definition of probability.
- 3. The geometric definition of probability. The meeting problem. Geometric probability.
- 4. Elements of combinatorics.
- 5. Permutations. The number of permutations of n objects.
- Example
- 6. Combinations. The number of combinations of n distinct objects taken r at a time.
- An example of combinations
- 7. Additive rules (the addition formula of probability): 1) A1, A2, …, An mutually exclusive; 2) A1, A2, …, An any events.
- 8. Conditional probability.
- 9. Independence.
- Independent events
- 11. The theorem of total probability.
- 12. Bayes' rule.
- 13. The Bernoulli scheme. The Bernoulli distribution.
- 14. The Poisson approximation formula.
- 15. The local de Moivre–Laplace theorem.
- 18. Independence of random variables.
- 21. Discrete random variables.
- 22. Discrete probability distributions. The probability density function.
- 23. Discrete probability distributions.
- 24. The continuous distribution function.
- 25. The continuous distribution function.
- Example
- 27. The joint density function.
- 28. Conditional distributions.
- 29. Statistical independence.
- Independent events
- 30. Mathematical expectation.
- 31. Mathematical expectation in the case of two random variables.
- 32. Variance of random variables.
- 33. Standard deviation.
- 35. Covariance of random variables.
- 36. The correlation coefficient.
- 37. Means and variances of linear combinations of random variables. We are still working towards finding the theoretical mean and variance of the sample mean.
- Example
- 38. Chebyshev's theorem.
- Example
- 39. Some discrete probability distributions. Binomial and multinomial distributions.
- 40. The geometric distribution.
- 41. The Poisson distribution.
- 42. The continuous uniform distribution. The normal distribution.
- 43. Exponential distributions.
- 44. Moments and moment-generating functions.
- 45. Populations and samples. Some important statistics.
- 46. Location measures of a sample: the sample mean, median, and mode.
- 1. The sample mean.
- Note that the statistic X̄ (X with a bar) assumes the value
- 47. The sample variance, standard deviation, and range.
- 48. The Central Limit Theorem.
- 49. The likelihood function.
- 50. Point estimates.
- 51. Estimating the mean.
- 53. Single sample: estimating the variance.
- 54. The sampling distribution of S².
- 55. Statistical hypotheses: general concepts.
- 56. Proof of the Poisson distribution formula.
49. The Likelihood Function.
In statistics, a likelihood function (often simply the likelihood) is a function of the parameters of a statistical model, defined as follows: the likelihood of a set of parameter values given some observed outcomes is equal to the probability of those observed outcomes given those parameter values, i.e. L(θ | x) = P(x | θ).

Likelihood functions play a key role in statistical inference, especially in methods of estimating a parameter from a set of statistics.

In non-technical parlance, "likelihood" is usually a synonym for "probability," but in statistical usage a clear technical distinction is made depending on the roles of the outcome and the parameter.

For the formal definition of the likelihood function, denote by x1, x2, …, xn independent random variables taken from a discrete probability distribution represented by f(x, θ), where θ is a single parameter of the distribution. Now we define

L(x1, x2, …, xn; θ) = f(x1, x2, …, xn; θ) = f(x1, θ) f(x2, θ) … f(xn, θ).

This function L, the joint distribution of the random variables, is referred to as the likelihood function. If x1, x2, …, xn denote the observed values in a sample, then in this discrete case the interpretation is very clear: L(x1, x2, …, xn; θ), the likelihood of the sample, is the joint probability P(X1 = x1, X2 = x2, …, Xn = xn; θ), which is the probability of obtaining the sample values x1, x2, …, xn.

Remark 2. In the discrete case, a maximum likelihood estimator is one that results in a maximum value of this joint probability, i.e. one that maximizes the likelihood of the sample.

Def 3. Given independent observations x1, x2, …, xn from a probability density function (continuous case) or probability mass function (discrete case) f(x, θ), the maximum likelihood estimator θ̂ is that value of θ which maximizes the likelihood function

L(x1, x2, …, xn; θ) = f(x1, θ) f(x2, θ) … f(xn, θ).
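As a quick numerical illustration (a sketch added here, not part of the original notes), the code below finds the maximum likelihood estimate of the Bernoulli parameter p for a small hypothetical sample by grid search over the log-likelihood. For Bernoulli data the closed-form MLE is the sample mean, and the search recovers it; the sample data and grid resolution are assumptions made for the example.

```python
import math

def log_likelihood(sample, p):
    """log L(x1,...,xn; p) = sum of log[p^x * (1-p)^(1-x)] for Bernoulli data."""
    return sum(x * math.log(p) + (1 - x) * math.log(1 - p) for x in sample)

sample = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]  # hypothetical 0/1 observations

# Grid search over candidate parameter values in (0, 1).
candidates = [k / 1000 for k in range(1, 1000)]
p_hat = max(candidates, key=lambda p: log_likelihood(sample, p))

print(p_hat)                      # maximizer found by the search
print(sum(sample) / len(sample))  # closed-form MLE: the sample mean
```

Maximizing the log-likelihood rather than the likelihood itself is the usual trick: the logarithm turns the product f(x1, θ)…f(xn, θ) into a sum and does not change the location of the maximum.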
50. Point estimate.
The estimation problem.

Remark 1. The sampling distribution of X̄ is centered at μ, and in most applications its variance is smaller than that of any other estimator of μ. Recall that we prefer a value of x̄ that comes from a sampling distribution with a small variance. According to the central limit theorem, we can expect the sampling distribution of X̄ to be approximately normally distributed with mean μ_X̄ = μ and standard deviation σ_X̄ = σ/√n. Writing z_{α/2} for the z value above which we find an area of α/2 under the normal curve, we obtain the picture shown in the figure.
Point Estimate.

Def 1. A point estimate of some population parameter θ is a single value θ̂ of a statistic Θ̂ (capital theta), computed from a sample drawn from the distribution f(x, θ), where θ is the parameter of the family.

Properties:

Unbiased Estimator (unbiasedness). A statistic Θ̂ is said to be an unbiased estimator of the parameter θ if E(Θ̂) = θ, as depicted in the figure; that is, Θ̂ is unbiased if and only if the two quantities E(Θ̂) and θ are equal.
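A small simulation (an added sketch, with hypothetical parameters μ = 5, σ = 2) makes the unbiasedness property concrete: across many samples, the average of X̄ stays at μ, while a naive variance estimator that divides by n systematically underestimates σ², unlike the sample variance S² that divides by n − 1.

```python
import random

random.seed(0)
mu, sigma, n, trials = 5.0, 2.0, 10, 20000

mean_estimates = []
naive_var_estimates = []   # divides by n     (biased low)
sample_var_estimates = []  # divides by n - 1 (unbiased)

for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    ss = sum((x - xbar) ** 2 for x in xs)
    mean_estimates.append(xbar)
    naive_var_estimates.append(ss / n)
    sample_var_estimates.append(ss / (n - 1))

def avg(a):
    return sum(a) / len(a)

print(avg(mean_estimates))        # close to mu = 5: X-bar is unbiased
print(avg(naive_var_estimates))   # below sigma^2: biased by factor (n-1)/n
print(avg(sample_var_estimates))  # close to sigma^2 = 4: S^2 is unbiased
```

The simulated averages approximate E(Θ̂) for each statistic, which is exactly the quantity the definition compares against θ.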
51. Estimating the Mean.
For the standardized sample mean Z = (X̄ − μ)/(σ/√n) we have

P(−z_{α/2} < Z < z_{α/2}) = 1 − α;

P(−z_{α/2} < (X̄ − μ)/(σ/√n) < z_{α/2}) = 1 − α;

P(X̄ − z_{α/2}·σ/√n < μ < X̄ + z_{α/2}·σ/√n) = 1 − α.

Thus, if a random sample of size n is selected from a population whose variance σ² is known and the mean x̄ is computed, the 100(1 − α)% confidence interval for μ (the mean) is

x̄ − z_{α/2}·σ/√n < μ < x̄ + z_{α/2}·σ/√n.
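The interval is straightforward to compute. The sketch below (added here, with made-up values x̄ = 2.6, σ = 0.3, n = 36) uses Python's standard-library `statistics.NormalDist` to find z_{α/2} and the 95% confidence interval for μ.

```python
from statistics import NormalDist

def mean_ci(xbar, sigma, n, conf=0.95):
    """100(1-alpha)% CI for mu when the population sigma is known:
    xbar +/- z_{alpha/2} * sigma / sqrt(n)."""
    alpha = 1 - conf
    z = NormalDist().inv_cdf(1 - alpha / 2)  # z_{alpha/2}, approx. 1.96 for 95%
    half = z * sigma / n ** 0.5
    return xbar - half, xbar + half

lo, hi = mean_ci(xbar=2.6, sigma=0.3, n=36, conf=0.95)
print(round(lo, 3), round(hi, 3))  # prints 2.502 2.698
```

Note how the width shrinks like 1/√n: quadrupling the sample size halves the interval.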
52. Prediction Intervals.

Def 1. Prediction interval of a future observation when σ is known. For a normal distribution of measurements with known σ and unknown μ, a 100(1 − α)% prediction interval of a future observation x0 is

x̄ − z_{α/2}·σ·√(1 + 1/n) < x0 < x̄ + z_{α/2}·σ·√(1 + 1/n),

where z_{α/2} is the z value leaving an area of α/2 to the right.
Def 2. Prediction interval of a future observation when σ is unknown. For a normal distribution of measurements with both μ and σ unknown, a 100(1 − α)% prediction interval of a future observation x0 is

x̄ − t_{α/2}·s·√(1 + 1/n) < x0 < x̄ + t_{α/2}·s·√(1 + 1/n),

where t_{α/2} is the t value with v = n − 1 degrees of freedom leaving an area of α/2 to the right.
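For the known-σ case of Def 1, the following sketch (added, with hypothetical values x̄ = 2.6, σ = 0.3, n = 36) computes a 95% prediction interval using only the standard library. The extra √(1 + 1/n) factor accounts for the variability of the future observation itself, so the interval is wider than the confidence interval for μ.

```python
from statistics import NormalDist

def prediction_interval(xbar, sigma, n, conf=0.95):
    """100(1-alpha)% prediction interval for one future observation x0
    from a normal population with sigma known:
    xbar +/- z_{alpha/2} * sigma * sqrt(1 + 1/n)."""
    z = NormalDist().inv_cdf(1 - (1 - conf) / 2)  # z_{alpha/2}
    half = z * sigma * (1 + 1 / n) ** 0.5
    return xbar - half, xbar + half

lo, hi = prediction_interval(xbar=2.6, sigma=0.3, n=36, conf=0.95)
print(round(lo, 3), round(hi, 3))
```

As n grows, √(1 + 1/n) → 1, so the prediction interval does not shrink to a point: it converges to x̄ ± z_{α/2}·σ, the spread of a single observation.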