- Distribution Overview
- Discrete Distributions
- Continuous Distributions
- Probability Theory
- Random Variables
- Transformations
- Expectation
- Variance
- Inequalities
- Distribution Relationships
- Probability and Moment Generating Functions
- Multivariate Distributions
- Standard Bivariate Normal
- Bivariate Normal
- Multivariate Normal
- Convergence
- Statistical Inference
- Point Estimation
- Empirical Distribution
- Statistical Functionals
- Parametric Inference
- Method of Moments
- Maximum Likelihood
- Delta Method
- Multiparameter Models
- Multiparameter Delta Method
- Parametric Bootstrap
- Hypothesis Testing
- Bayesian Inference
- Credible Intervals
- Function of Parameters
- Priors
- Conjugate Priors
- Bayesian Testing
- Exponential Family
- Sampling Methods
- The Bootstrap
- Rejection Sampling
- Importance Sampling
- Decision Theory
- Risk
- Admissibility
- Bayes Rule
- Minimax Rules
- Linear Regression
- Simple Linear Regression
- Prediction
- Multiple Regression
- Model Selection
- Non-parametric Function Estimation
- Density Estimation
- Histograms
- Kernel Density Estimator (KDE)
- Smoothing Using Orthogonal Functions
- Stochastic Processes
- Markov Chains
- Poisson Processes
- Time Series
- Stationary Time Series
- Estimation of Correlation
- Detrending
- ARIMA Models
- Causality and Invertibility
- Spectral Analysis
- Math
- Gamma Function
- Beta Function
- Series
- Combinatorics
Cross-validation estimate of E[J(h)]

  R̂_CV(J) = Σ_{i=1}^{n} ( Y_i − Σ_{j=1}^{J} φ_j(x_i) β̂_{j,(−i)} )²

where β̂_{j,(−i)} is the estimate of the coefficient β_j computed without the i-th observation.
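A direct (naive, one refit per observation) numpy sketch of this estimator; the cosine basis, the data, and the function names are illustrative choices, not from the text:

```python
import numpy as np

def cv_risk(x, y, J):
    """Leave-one-out CV estimate R_CV(J) for an orthogonal-series fit
    with J cosine basis functions on [0, 1]."""
    n = len(x)
    # Design matrix: phi_1(x) = 1, phi_j(x) = sqrt(2) cos((j-1) pi x)
    Phi = np.ones((n, J))
    for j in range(1, J):
        Phi[:, j] = np.sqrt(2) * np.cos(j * np.pi * x)
    resid2 = 0.0
    for i in range(n):
        mask = np.arange(n) != i
        # beta_hat_{j,(-i)}: least-squares coefficients without point i
        beta, *_ = np.linalg.lstsq(Phi[mask], y[mask], rcond=None)
        resid2 += (y[i] - Phi[i] @ beta) ** 2
    return resid2

rng = np.random.default_rng(0)
x = rng.uniform(size=50)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=50)
risks = {J: cv_risk(x, y, J) for J in (1, 3, 5, 10)}
```

One then picks the J that minimizes R̂_CV(J).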
20 Stochastic Processes
Stochastic Process
A stochastic process is an indexed collection of random variables

  X = { X_t : t ∈ T }

where the index set T is either discrete, T = {0, 1, …} ⊂ Z, or continuous, T = [0, ∞).

Notation: X_t, X(t)
State space: X
Index set: T
20.1 Markov Chains
Markov chain

  P[X_n = x | X_0, …, X_{n−1}] = P[X_n = x | X_{n−1}]   ∀n ∈ T, x ∈ X

Transition probabilities

  p_{ij} ≡ P[X_{n+1} = j | X_n = i]
  p_{ij}(n) ≡ P[X_{m+n} = j | X_m = i]   (n-step)

Transition matrix P (n-step: P_n)
  (i, j) element is p_{ij}
  p_{ij} > 0 and Σ_j p_{ij} = 1

Chapman-Kolmogorov

  p_{ij}(m + n) = Σ_k p_{ik}(m) p_{kj}(n)
  P_{m+n} = P_m P_n
  P_n = P · P ⋯ P = P^n

Marginal probability

  μ_n = (μ_n(1), …, μ_n(N)),  where μ_n(i) = P[X_n = i]
  μ_0: initial distribution
  μ_n = μ_0 P^n
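These matrix identities are easy to check numerically; a minimal numpy sketch with a hypothetical two-state chain:

```python
import numpy as np

# Hypothetical 2-state transition matrix; each row sums to 1.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Chapman-Kolmogorov in matrix form: P_{m+n} = P_m P_n.
m, n = 2, 3
assert np.allclose(np.linalg.matrix_power(P, m + n),
                   np.linalg.matrix_power(P, m) @ np.linalg.matrix_power(P, n))

# Marginal distribution: mu_n = mu_0 P^n.
mu0 = np.array([1.0, 0.0])                # start in state 0
mu5 = mu0 @ np.linalg.matrix_power(P, 5)
print(mu5)                                # still a probability vector
```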
20.2 Poisson Processes
Poisson process

  {X_t : t ∈ [0, ∞)} = number of events up to and including time t
  X_0 = 0
  Independent increments:
    ∀ t_0 < ⋯ < t_n : X_{t_1} − X_{t_0} ⊥⊥ ⋯ ⊥⊥ X_{t_n} − X_{t_{n−1}}
  Intensity function λ(t)
    P[X_{t+h} − X_t = 1] = λ(t) h + o(h)
    P[X_{t+h} − X_t = 2] = o(h)
  X_{s+t} − X_s ∼ Po(m(s + t) − m(s)),  where m(t) = ∫_0^t λ(s) ds
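The increment distribution can be checked by simulation. The sketch below uses Lewis-Shedler thinning (a rejection scheme not described in this text) with an illustrative intensity λ(t) = 1 + t, for which m(t) = t + t²/2:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical intensity lambda(t) = 1 + t, so m(t) = t + t^2/2 and
# X_{s+t} - X_s ~ Po(m(s+t) - m(s)).
lam = lambda u: 1.0 + u
m = lambda u: u + u**2 / 2
s, t = 1.0, 2.0
lam_max = lam(s + t)            # upper bound on lambda over [0, s+t]

reps = 5000
incr = np.empty(reps)
for r in range(reps):
    # Thinning: propose homogeneous Po(lam_max) arrivals on [0, s+t],
    # keep each proposal u with probability lambda(u)/lam_max.
    n_prop = rng.poisson(lam_max * (s + t))
    props = np.sort(rng.uniform(0, s + t, size=n_prop))
    keep = props[rng.uniform(size=n_prop) < lam(props) / lam_max]
    incr[r] = np.sum((keep > s) & (keep <= s + t))

print(incr.mean())              # approx m(3) - m(1) = 6.0
```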
Homogeneous Poisson process

  λ(t) = λ  ⟹  X_t ∼ Po(λt),   λ > 0

Waiting times

  W_t := time at which X_t occurs
  W_t ∼ Gamma(t, 1/λ)

Interarrival times

  S_t = W_{t+1} − W_t
  S_t ∼ Exp(1/λ)

[Figure: timeline showing the interarrival time S_t between W_{t−1} and W_t]
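A simulation sketch of these relationships for a homogeneous process (the rate, horizon, and replication count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
lam, t, reps = 2.0, 5.0, 5000   # rate, time horizon, replications

# Interarrival times S ~ Exp(1/lam) (mean 1/lam); their cumulative sums
# are the waiting times W_k = S_1 + ... + S_k ~ Gamma(k, 1/lam).
S = rng.exponential(scale=1 / lam, size=(reps, 80))  # 80 >> lam*t arrivals
W = S.cumsum(axis=1)

# Counting arrivals in [0, t] gives X_t ~ Po(lam*t): mean = var = 10.
counts = (W <= t).sum(axis=1)
print(counts.mean(), counts.var())

# Fourth waiting time W_4 ~ Gamma(4, 1/lam): mean 4/lam = 2.
print(W[:, 3].mean())
```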