
Thursday, 17 October 2013

Covariance matrix



If X is a random vector
  \mathbf{X} = \begin{bmatrix}X_1 \\  \vdots \\ X_n \end{bmatrix},
the covariance matrix is


\Sigma
= \begin{bmatrix}
 \mathrm{E}[(X_1 - \mu_1)(X_1 - \mu_1)] & \mathrm{E}[(X_1 - \mu_1)(X_2 - \mu_2)] & \cdots & \mathrm{E}[(X_1 - \mu_1)(X_n - \mu_n)] \\ \\
 \mathrm{E}[(X_2 - \mu_2)(X_1 - \mu_1)] & \mathrm{E}[(X_2 - \mu_2)(X_2 - \mu_2)] & \cdots & \mathrm{E}[(X_2 - \mu_2)(X_n - \mu_n)] \\ \\
 \vdots & \vdots & \ddots & \vdots \\ \\
 \mathrm{E}[(X_n - \mu_n)(X_1 - \mu_1)] & \mathrm{E}[(X_n - \mu_n)(X_2 - \mu_2)] & \cdots & \mathrm{E}[(X_n - \mu_n)(X_n - \mu_n)]
\end{bmatrix}.
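As a quick sanity check, the entrywise definition above can be computed directly. This is a sketch using numpy; the dataset, dimensions, and seed are arbitrary choices for illustration:

```python
import numpy as np

# An illustrative dataset: 1000 draws of a 3-dimensional random vector X.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))

mu = X.mean(axis=0)                 # estimates of mu_1, ..., mu_n
Xc = X - mu                         # center each component around its mean
Sigma = Xc.T @ Xc / (len(X) - 1)    # sample estimate of E[(X_i - mu_i)(X_j - mu_j)]

# The entrywise definition is symmetric, and the estimate agrees
# with numpy's built-in covariance estimator.
assert np.allclose(Sigma, Sigma.T)
assert np.allclose(Sigma, np.cov(X, rowvar=False))
```

Note that `np.cov` divides by n − 1 by default, so the manual formula above uses the same normalization.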


\det(\Sigma) = \prod_{i=1}^{n} \lambda_i > 0, where \lambda_1, \dots, \lambda_n are the eigenvalues of \Sigma.

Since \Sigma is symmetric and positive definite, it can be orthogonally diagonalized: its eigenvalues are all real and positive, and its eigenvectors can be chosen orthonormal, so \Sigma = V \Lambda V^T with V orthogonal and \Lambda diagonal.
\det(\Sigma) = \det(V \Lambda V^T) = \det(V)\,\det(\Lambda)\,\det(V^T) = \det(\Lambda)

\det(V) = \pm 1 because \det(V V^{-1}) = \det(V)\,\det(V^{-1}) = \det(V)\,\det(V^T) = \det(V)^2 = 1
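The claims above can be verified numerically. A sketch with numpy, using an arbitrary randomly generated dataset whose sample covariance is symmetric positive definite (almost surely):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(500, 4))
Sigma = np.cov(A, rowvar=False)     # symmetric positive definite (almost surely)

# Eigendecomposition Sigma = V Lambda V^T, with V orthogonal.
lam, V = np.linalg.eigh(Sigma)

assert np.all(lam > 0)                                  # eigenvalues all positive
assert np.isclose(np.linalg.det(Sigma), np.prod(lam))   # det(Sigma) = product of eigenvalues
assert np.isclose(abs(np.linalg.det(V)), 1.0)           # det(V) = ±1 since V is orthogonal
```

`np.linalg.eigh` is the right decomposition here because it exploits symmetry and returns real eigenvalues with orthonormal eigenvectors, exactly the structure the argument relies on.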

References:
http://www.ece.unm.edu/faculty/bsanthan/EECE-541/covar.pdf
