- •2 The classical definition of probability.
- •3) Geometric definition of probability. The meeting problem. Geometric probability.
- •4) Elements of combinatorics.
- •5) A permutation. The number of permutations of n objects.
- •Example
- •6)A combination. The number of combinations of n distinct objects taken r at a time.
- •An Example of Combinations
- •7. Additive Rules (addition formula of probability). 1) A1, A2, …, An are mutually exclusive. 2) A1, A2, …, An are any events.
- •8. Conditional probability.
- •9. Independence
- •Independent events
- •11. The Theorem of total probability.
- •12. Bayes’ Rule.
- •13. Bernoulli scheme. Bernoulli distribution.
- •14. Poisson approximation formula.
- •15. The local Moivre-Laplace theorem.
- •18. Independence of random variables.
- •21. Discrete random variable
- •22. Discrete Probability Distributions. Probability Density function.
- •23. Discrete Probability Distributions
- •24. Continuous distribution function
- •25. Continuous distribution function
- •Example
- •27. Joint Density Function
- •28. Conditional distribution
- •29. Statistical Independence
- •Independent events
- •30. Mathematical expectation
- •31. Mathematical expectation to the case of two random variables
- •32. Variance of random variables
- •33. Standard deviation.
- •35. Covariance of Random Variables
- •36. The correlation coefficient.
- •37. Means and Variances of Linear Combinations of Random Variables. We are still working towards finding the theoretical mean and variance of the sample mean:
- •Example
- •38. Chebyshev’s Theorem.
- •Example
- •39. Some Discrete Probability Distributions. Binomial and Multinomial Distributions.
- •40. Geometric Distribution.
- •41. Poisson Distribution.
- •42. Continuous Uniform Distribution. Normal Distribution.
- •43. Exponential Distributions.
- •44. Moments and Moment-Generating Functions.
- •45. Populations and Samples. Some Important Statistics.
- •46. Location Measures of a Sample: The Sample Mean, Median, and Mode.
- •1.Sample Mean.
- •Note that the statistic X̄ assumes the value x̄.
- •47. The Sample Variance, Standard Deviation and Range.
- •48. The Central Limit Theorem.
- •49. The Likelihood Function.
- •50. Point estimate.
- •51. Estimating the Mean.
- •53. Single Sample: Estimating the Variance.
- •54. Sampling Distribution of s2.
- •55. Statistical Hypotheses: General Concepts.
- •56. Prove the formula of Poisson distribution:
46. Location Measures of a Sample: The Sample Mean, Median, and Mode.
The most commonly used statistics for measuring the center of a set of data arranged in order of magnitude are the mean, the median, and the mode. Let X1, X2, …, Xn represent n random variables.
1. Sample Mean.
Note that the statistic X̄ assumes the value x̄. Then X1 assumes the value x1, X2 assumes the value x2, and so on. The statistic is

X̄ = (X1 + X2 + … + Xn)/n = (1/n) Σ Xi

and its computed value is

x̄ = (x1 + x2 + … + xn)/n = (1/n) Σ xi.
2. Sample Median.
Given that the observations in a sample x1, x2, …, xn are arranged in increasing order of magnitude, the sample median is the middle observation if n is odd, and the average of the two middle observations if n is even.
Remark2. The sample median is also a location measure: it shows the middle value of the sample.
3. The Sample Mode. The sample mode is the value of the sample that occurs most often.
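As a sketch (the data set here is hypothetical, not from the notes), the three location measures above can be computed with Python's standard statistics module:

```python
import statistics

# Hypothetical sample of n = 5 observations
x = [2, 3, 3, 5, 7]

mean = sum(x) / len(x)         # sample mean: (1/n) * sum of the observations
median = statistics.median(x)  # middle value of the ordered sample (n is odd)
mode = statistics.mode(x)      # value that occurs most often

print(mean, median, mode)      # 4.0 3 3
```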
47. The Sample Variance, Standard Deviation and Range.
Variability in a sample displays how the observations spread out from the average. It is possible to have two sets of observations with the same mean or median that differ considerably in the variability of their measurements about the average. Consider the following measurements for two samples of orange juice bottled by companies A and B:

Sample A: {0.97, 1.00, 0.94, 1.03, 1.06}
Sample B: {1.06, 1.01, 0.88, 0.91, 1.14}

Both samples have the same mean, 1 liter. It is obvious that company A bottles orange juice with a more uniform content than company B. We say that the variability, or dispersion, of the observations from the average is less for sample A than for sample B. Therefore, in buying orange juice we would feel more confident that the bottle we select will be close to the average if we buy from company A.
Def1. The sample variance, denoted by s², is given by the following formula:

s² = Σ (xi − x̄)² / (n − 1).

The sample standard deviation, denoted by s, is s = √s².
Remark1. It should be clear that the sample standard deviation is in fact a measure of variability. The quantity n − 1 is the number of degrees of freedom associated with the variance estimate. The quantities inside the brackets sum to 0; in general,

Σ (xi − x̄) = 0.
Theorem. If s² is the variance of a random sample of size n, we can write

s² = (n Σ xi² − (Σ xi)²) / (n(n − 1)).
Sample Range
Def2. The sample range is R = xmax − xmin.
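A small sketch, using Sample A from the orange-juice example above, checking that the definitional formula for s² and the computational formula from the theorem agree, and computing s and R:

```python
import math

# Sample A from the orange-juice example
a = [0.97, 1.00, 0.94, 1.03, 1.06]
n = len(a)
xbar = sum(a) / n

# Definitional formula: s^2 = sum (xi - xbar)^2 / (n - 1)
s2_def = sum((xi - xbar) ** 2 for xi in a) / (n - 1)

# Computational formula from the theorem:
# s^2 = (n * sum xi^2 - (sum xi)^2) / (n * (n - 1))
s2_comp = (n * sum(xi ** 2 for xi in a) - sum(a) ** 2) / (n * (n - 1))

s = math.sqrt(s2_def)   # sample standard deviation
R = max(a) - min(a)     # sample range

assert abs(s2_def - s2_comp) < 1e-12
print(s2_def)           # 0.00225
```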
Areas under the normal curve
Def3. The distribution of a normal random variable with μ = 0 and σ = 1 is called the standard normal distribution.
Def4 (Normal Distribution). The density of the normal random variable X with mean μ and standard deviation σ is

n(x; μ, σ) = (1/(σ√(2π))) e^(−(x−μ)²/(2σ²)), −∞ < x < ∞.
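As a sketch of Def3 and Def4, the density above and areas under the standard normal curve can be evaluated with only the math module (the cumulative area is expressed through the error function erf):

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density n(x; mu, sigma) = exp(-(x-mu)^2 / (2 sigma^2)) / (sigma sqrt(2 pi))."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def standard_normal_cdf(z):
    """Area under the standard normal curve to the left of z, via erf."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# The standard normal density at 0 is 1 / sqrt(2 pi) ≈ 0.3989
print(round(normal_pdf(0), 4))   # 0.3989

# About 95% of the area lies within 1.96 standard deviations of the mean
print(round(standard_normal_cdf(1.96) - standard_normal_cdf(-1.96), 3))  # 0.95
```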
48. The Central Limit Theorem.
In probability theory, the central limit theorem (CLT) states that, given certain conditions, the mean of a sufficiently large number of independent random variables, each with finite mean and variance, will be approximately normally distributed.[1] The central limit theorem has a number of variants. In its common form, the random variables must be identically distributed. In variants, convergence of the mean to the normal distribution also occurs for non-identical distributions, provided that they comply with certain conditions.
In more general probability theory, a central limit theorem is any of a set of weak-convergence theorems. They all express the fact that a sum of many independent and identically distributed (i.i.d.) random variables, or alternatively, random variables with specific types of dependence, will tend to be distributed according to one of a small set of attractor distributions. When the variance of the i.i.d. variables is finite, the attractor distribution is the normal distribution. In contrast, the sum of a number of i.i.d. random variables with power-law tail distributions decreasing as |x|^(−α−1), where 0 < α < 2 (and therefore having infinite variance), will tend to an alpha-stable distribution with stability parameter (or index of stability) α as the number of variables grows.
Suppose the population from which samples are taken has a probability distribution with mean μ and variance σ², not necessarily a normal distribution. Then the standardized variable associated with X̄, given by the formula

z = (X̄ − μ) / (σ/√n),

is asymptotically normal in the limit:

lim(n→∞) P(Z ≤ z) = (1/√(2π)) ∫ from −∞ to z of e^(−u²/2) du.
Remark4.
The normal approximation for X̄ will generally be good if n ≥ 30, provided the population distribution is not terribly skewed. If n < 30, the approximation is good only if the population is not too different from normal. If the population is known to be normal, the sampling distribution of X̄ will follow a normal distribution exactly, no matter how small the size of the samples.
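The standardization above can be illustrated with a small simulation sketch (the uniform population, sample size, and trial count are arbitrary choices, not from the notes): sample means of a Uniform(0, 1) population are standardized with z = (x̄ − μ)/(σ/√n), and the fraction of z-values below 1 is compared with the standard normal area Φ(1) ≈ 0.8413.

```python
import math
import random

random.seed(0)

mu, sigma = 0.5, 1 / math.sqrt(12)   # mean and std of the Uniform(0, 1) population
n, trials = 30, 20000                # sample size and number of simulated samples

zs = []
for _ in range(trials):
    xbar = sum(random.random() for _ in range(n)) / n
    # Standardize the sample mean: z = (xbar - mu) / (sigma / sqrt(n))
    zs.append((xbar - mu) / (sigma / math.sqrt(n)))

# By the CLT the fraction of z-values below 1.0 should be near Phi(1) ≈ 0.8413
frac = sum(z < 1.0 for z in zs) / trials
print(round(frac, 2))
```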
