Thursday 17 October 2013

Covariance matrix



If \(\mathbf{X}\) is a random vector
\begin{align}
  \mathbf{X} = \begin{bmatrix}X_1 \\ \vdots \\ X_n \end{bmatrix}
\end{align}
with means \(\mu_i = \mathrm{E}[X_i]\), its covariance matrix is
\begin{align}
\Sigma
= \mathrm{E}\!\left[(\mathbf{X} - \boldsymbol{\mu})(\mathbf{X} - \boldsymbol{\mu})^T\right]
= \begin{bmatrix}
 \mathrm{E}[(X_1 - \mu_1)(X_1 - \mu_1)] & \mathrm{E}[(X_1 - \mu_1)(X_2 - \mu_2)] & \cdots & \mathrm{E}[(X_1 - \mu_1)(X_n - \mu_n)] \\
 \mathrm{E}[(X_2 - \mu_2)(X_1 - \mu_1)] & \mathrm{E}[(X_2 - \mu_2)(X_2 - \mu_2)] & \cdots & \mathrm{E}[(X_2 - \mu_2)(X_n - \mu_n)] \\
 \vdots & \vdots & \ddots & \vdots \\
 \mathrm{E}[(X_n - \mu_n)(X_1 - \mu_1)] & \mathrm{E}[(X_n - \mu_n)(X_2 - \mu_2)] & \cdots & \mathrm{E}[(X_n - \mu_n)(X_n - \mu_n)]
\end{bmatrix}.
\end{align}
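As a quick numerical sanity check (a minimal NumPy sketch; the mixing matrix and sample size below are made up for illustration), the sample version of this matrix is just the average of the outer products \((\mathbf{x} - \bar{\mathbf{x}})(\mathbf{x} - \bar{\mathbf{x}})^T\), and it agrees with np.cov up to the normalization convention:

import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: n = 3 variables, N = 10,000 observations (one per row).
N, n = 10_000, 3
A = rng.normal(size=(n, n))        # arbitrary mixing matrix, just to correlate the columns
X = rng.normal(size=(N, n)) @ A.T  # samples of the random vector X

mu = X.mean(axis=0)                # sample means mu_1, ..., mu_n
Xc = X - mu                        # centered columns X_i - mu_i

# Covariance matrix from the definition: average outer product of the centered vector.
Sigma = Xc.T @ Xc / N

# np.cov with bias=True uses the same 1/N normalization.
assert np.allclose(Sigma, np.cov(X, rowvar=False, bias=True))
print(Sigma)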


\(\det(\Sigma) = \prod_{i=1}^n\lambda_i \geq 0\), where the \(\lambda_i\) are the eigenvalues of \(\Sigma\).
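For a concrete bivariate example, write \(\mathrm{Cov}(X_1, X_2) = \rho\sigma_1\sigma_2\) with \(|\rho| \leq 1\) (Cauchy–Schwarz); then
\begin{align}
\det(\Sigma) = \det\begin{bmatrix} \sigma_1^2 & \rho\sigma_1\sigma_2 \\ \rho\sigma_1\sigma_2 & \sigma_2^2 \end{bmatrix} = \sigma_1^2\sigma_2^2(1 - \rho^2) \geq 0,
\end{align}
with equality when \(|\rho| = 1\) or one of the variances is zero.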

Since \(\Sigma\) is symmetric and positive semidefinite (for any vector \(a\), \(a^T \Sigma a = \mathrm{Var}(a^T\mathbf{X}) \geq 0\)), it can be diagonalized as \(\Sigma = V \Lambda V^T\), where the columns of \(V\) are orthonormal eigenvectors and \(\Lambda\) is the diagonal matrix of eigenvalues, which are all real and nonnegative. Hence
\begin{align}
\det(\Sigma) = \det(V\Lambda V^T) = \det(V)\cdot \det(\Lambda) \cdot \det(V^T) = \det(\Lambda) = \prod_{i=1}^n \lambda_i \geq 0.
\end{align}

Here \(\det(V)\det(V^T) = 1\) because \(V\) is orthogonal, so \(V^{-1} = V^T\) and \(\det(V)^2 = \det(V)\det(V^T) = \det(V)\det(V^{-1}) = \det(VV^{-1}) = \det(I) = 1\), i.e. \(\det(V) = \pm 1\).
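All of the pieces above can also be checked numerically (another small NumPy sketch; the matrix \(B\) used to build a positive semidefinite \(\Sigma\) is arbitrary). np.linalg.eigh returns the eigenvalues and orthonormal eigenvectors of a symmetric matrix:

import numpy as np

rng = np.random.default_rng(1)

# Arbitrary symmetric positive semidefinite matrix: Sigma = B B^T.
B = rng.normal(size=(4, 4))
Sigma = B @ B.T

# Eigendecomposition of the symmetric matrix: Sigma = V Lambda V^T.
lam, V = np.linalg.eigh(Sigma)     # eigenvalues (ascending) and orthonormal eigenvectors

assert np.allclose(Sigma, V @ np.diag(lam) @ V.T)      # Sigma = V Lambda V^T
assert np.allclose(V @ V.T, np.eye(4))                 # V is orthogonal
assert np.isclose(abs(np.linalg.det(V)), 1.0)          # det(V) = +/- 1
assert np.all(lam >= -1e-9)                            # eigenvalues real and nonnegative
assert np.isclose(np.linalg.det(Sigma), np.prod(lam))  # det(Sigma) = product of eigenvalues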

References:
http://www.ece.unm.edu/faculty/bsanthan/EECE-541/covar.pdf
