
- 2. The classical definition of probability.
- 3. Geometric definition of probability. The problem of meeting. Geometric probability.
- 4. Elements of combinatorics.
- 5. A permutation. The number of permutations of n objects.
- Example
- 6. A combination. The number of combinations of n distinct objects taken r at a time.
- An example of combinations
- 7. Additive rules (addition formula of probability): 1) A1, A2, ..., An are mutually exclusive; 2) A1, A2, ..., An are any events.
- 8. Conditional probability.
- 9. Independence.
- Independent events
- 11. The theorem of total probability.
- 12. Bayes' rule.
- 13. Bernoulli scheme. Bernoulli distribution.
- 14. Poisson approximation formula.
- 15. The local de Moivre–Laplace theorem.
- 18. Independence of random variables.
- 21. Discrete random variable.
- 22. Discrete probability distributions. Probability density function.
- 23. Discrete probability distributions.
- 24. Continuous distribution function.
- 25. Continuous distribution function.
- Example
- 27. Joint density function.
- 28. Conditional distribution.
- 29. Statistical independence.
- Independent events
- 30. Mathematical expectation.
- 31. Mathematical expectation in the case of two random variables.
- 32. Variance of random variables.
- 33. Standard deviation.
- 35. Covariance of random variables.
- 36. The correlation coefficient.
- 37. Means and variances of linear combinations of random variables. We are still working towards finding the theoretical mean and variance of the sample mean.
- Example
- 38. Chebyshev's theorem.
- Example
- 39. Some discrete probability distributions. Binomial and multinomial distributions.
- 40. Geometric distribution.
- 41. Poisson distribution.
- 42. Continuous uniform distribution. Normal distribution.
- 43. Exponential distributions.
- 44. Moments and moment-generating functions.
- 45. Populations and samples. Some important statistics.
- 46. Location measures of a sample: the sample mean, median, and mode.
- 1. Sample mean.
- Note that the statistic X̄ assumes the value
- 47. The sample variance, standard deviation, and range.
- 48. The central limit theorem.
- 49. The likelihood function.
- 50. Point estimate.
- 51. Estimating the mean.
- 53. Single sample: estimating the variance.
- 54. Sampling distribution of S².
- 55. Statistical hypotheses: general concepts.
- 56. Prove the formula of the Poisson distribution.
35. Covariance of Random Variables
Def1. A random variable is a function that associates a real number with each element in the sample space.
Def2. The set of ordered pairs (x, f(x)) is a probability function, probability mass function, or probability distribution of the discrete random variable X if, for each possible outcome x:
1) f(x) ≥ 0;
2) Σx f(x) = 1;
3) P(X = x) = f(x).
Def3. Let X be a random variable with probability distribution f(x). The mean or expected value of X is µ = E(X) = Σx x·f(x).
Def4. Let X be a random variable with probability distribution f(x) and expected value µ. The variance of X is σ² = E[(X − µ)²] = Σx (x − µ)²·f(x). The positive square root of the variance, σ, is called the standard deviation of X.
Def5. Let X be a random variable with probability function f(x) and mean or expected value µ. The variance of X can equivalently be written σ² = E(X²) − µ².
Def6. Let X and Y be random variables with joint probability distribution f(x, y). The covariance of X and Y, denoted σXY, is σXY = E[(X − µX)(Y − µY)] = Σx Σy (x − µX)(y − µY)·f(x, y) when X and Y are discrete.
Def7. Let X and Y be random variables with covariance σXY and standard deviations σX and σY, respectively. The correlation coefficient of X and Y is ρXY = σXY / (σX·σY).
Remark1. The correlation coefficient ρXY is a scale-free version of the covariance and is widely used in statistics.
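As a small numeric sketch of Def6, the covariance of two discrete random variables can be computed directly from their joint distribution; the joint distribution below is invented purely for illustration.

```python
# Covariance of discrete random variables X and Y from their joint
# probability distribution f(x, y) (Def6). The distribution here is a
# made-up example, not data from the text.
joint = {  # f(x, y) for each pair (x, y)
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

# Marginal means: mu_X = sum of x*f(x, y), mu_Y = sum of y*f(x, y)
mu_x = sum(x * p for (x, y), p in joint.items())
mu_y = sum(y * p for (x, y), p in joint.items())

# sigma_XY = sum over all (x, y) of (x - mu_X)(y - mu_Y) f(x, y)
cov = sum((x - mu_x) * (y - mu_y) * p for (x, y), p in joint.items())

print(mu_x, mu_y, cov)  # mu_X = 0.7, mu_Y = 0.6, covariance close to -0.02
```

The slightly negative covariance here just reflects how this particular made-up distribution was chosen; the same double sum works for any finite discrete joint distribution.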
36. The correlation coefficient.
Def1. A random variable is a function that associates a real number with each element in the sample space.
Def2. The set of ordered pairs (x, f(x)) is a probability function, probability mass function, or probability distribution of the discrete random variable X if, for each possible outcome x:
1) f(x) ≥ 0;
2) Σx f(x) = 1;
3) P(X = x) = f(x).
Def3. Let X be a random variable with probability distribution f(x). The mean or expected value of X is µ = E(X) = Σx x·f(x).
Def4. Let X be a random variable with probability distribution f(x) and expected value µ. The variance of X is σ² = E[(X − µ)²] = Σx (x − µ)²·f(x). The positive square root of the variance, σ, is called the standard deviation of X.
Def5. Let X be a random variable with probability function f(x) and mean or expected value µ. The variance of X can equivalently be written σ² = E(X²) − µ².
Def6. Let X and Y be random variables with joint probability distribution f(x, y). The covariance of X and Y, denoted σXY, is σXY = E[(X − µX)(Y − µY)] = Σx Σy (x − µX)(y − µY)·f(x, y) when X and Y are discrete.
Def7. Let X and Y be random variables with covariance σXY and standard deviations σX and σY, respectively. The correlation coefficient of X and Y is ρXY = σXY / (σX·σY).
Remark1. The correlation coefficient ρXY is a scale-free version of the covariance and is widely used in statistics.
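To illustrate Def7 and Remark1, the snippet below computes ρXY = σXY / (σX·σY) for an invented discrete joint distribution and confirms it lies in [−1, 1]; all numbers are assumptions for the sake of the example.

```python
import math

# Correlation coefficient rho_XY = sigma_XY / (sigma_X * sigma_Y) (Def7)
# for a made-up discrete joint distribution f(x, y).
joint = {(1, 1): 0.4, (1, 2): 0.1, (2, 1): 0.1, (2, 2): 0.4}

def expect(g):
    """E[g(X, Y)] over the joint distribution."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

mu_x = expect(lambda x, y: x)
mu_y = expect(lambda x, y: y)
var_x = expect(lambda x, y: (x - mu_x) ** 2)
var_y = expect(lambda x, y: (y - mu_y) ** 2)
cov = expect(lambda x, y: (x - mu_x) * (y - mu_y))

# Dividing by the standard deviations makes rho scale-free (Remark1):
# replacing X by aX + b with a > 0 would leave rho unchanged.
rho = cov / (math.sqrt(var_x) * math.sqrt(var_y))
print(rho)  # close to 0.6 for this distribution
```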
37. Means and Variances of Linear Combinations of Random Variables. We are still working towards finding the theoretical mean and variance of the sample mean:
X̄ = (X₁ + X₂ + ⋯ + Xₙ)/n
If we re-write the formula for the sample mean just a bit:
X̄ = (1/n)X₁ + (1/n)X₂ + ⋯ + (1/n)Xₙ
we can see more clearly that the sample mean is a linear combination of the random variables X1, X2, ..., Xn — hence the title and subject of this section. That is, here we'll add a few more tools to our toolbox, namely determining the mean and variance of a linear combination of random variables X1, X2, ..., Xn. Before presenting and proving the major theorem, let's revisit again, by way of example, why we would expect the sample mean and sample variance to have a theoretical mean and variance.
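One way to see why X̄ should have a theoretical mean and variance is by simulation. The sketch below (the choice of a fair six-sided die is an assumption made for illustration) draws many sample means of n rolls and checks that they center near µ = E(X) with spread near σ²/n, as the linear-combination rules predict.

```python
import random
import statistics

# Empirical check that Xbar = (X1 + ... + Xn)/n has mean mu and
# variance sigma^2 / n, using i.i.d. fair six-sided die rolls
# (an assumed example distribution, not from the text).
random.seed(0)
n, trials = 10, 20000
mu = 3.5          # E(X) for one die roll
sigma2 = 35 / 12  # Var(X) for one die roll

# Each entry of xbars is one realization of the sample mean of n rolls.
xbars = [statistics.mean(random.randint(1, 6) for _ in range(n))
         for _ in range(trials)]

print(statistics.mean(xbars))      # close to mu = 3.5
print(statistics.variance(xbars))  # close to sigma2 / n ≈ 0.2917
```

The empirical variance of the sample means is roughly one-tenth of the single-roll variance here, matching Var(X̄) = σ²/n with n = 10.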