WHAT IS REDUNDANT INFORMATION?

The statistician's point of view is this: Suppose we draw input patterns from an $m$-dimensional distribution given by a random variable $X = (X_1, X_2, \ldots, X_m)$. The $p$-th pattern is a vector $x^p = (x^p_1, x^p_2, \ldots ,x^p_m)^T \in R^m$.

Redundancy means that pattern components share mutual information: if we know certain components of a pattern, then we already know something about the other components. Redundancy implies statistical dependence of the components:

\begin{displaymath}
P(X_i = x_i) \neq P(X_i = x_i \mid \{X_k=x_k, k \neq i \}) .
\end{displaymath}
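The inequality above can be checked empirically. The following sketch (an illustration not taken from the text; the copy probability 0.9 and sample size are arbitrary choices) draws two binary pattern components where $X_2$ copies $X_1$ most of the time, then estimates the marginal $P(X_1 = 1)$ and the conditional $P(X_1 = 1 \mid X_2 = 1)$ from samples. Their difference is exactly the statistical dependence that signals redundancy.

```python
import numpy as np

# Two binary pattern components: X2 copies X1 with probability 0.9,
# so the components are redundant (statistically dependent).
rng = np.random.default_rng(0)
n = 100_000
x1 = rng.integers(0, 2, size=n)
copy = rng.random(n) < 0.9
x2 = np.where(copy, x1, 1 - x1)

# Marginal probability P(X1 = 1): about 0.5 by construction.
p_marginal = np.mean(x1 == 1)

# Conditional probability P(X1 = 1 | X2 = 1): about 0.9,
# since observing X2 tells us a lot about X1.
p_conditional = np.mean(x1[x2 == 1] == 1)

print(p_marginal, p_conditional)
```

If the components were independent, the two estimates would agree (up to sampling noise); here they differ markedly, which is the displayed inequality in empirical form.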



Juergen Schmidhuber 2003-02-19