

Normalized Asymmetric Mutual Information

The entropy of either the clustering $ X$ or the categorization $ Y$ provides another tight bound on the mutual information $ I(X,Y)$ that can be used for normalization. Since the categorization $ Y$ is a stable, user-given distribution, let us consider

$\displaystyle I(X,Y) \leq H(Y) .$ (8.10)
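
This bound follows from decomposing the mutual information into marginal and conditional entropy,

$\displaystyle I(X,Y) = H(Y) - H(Y\vert X) \leq H(Y) ,$

since the conditional entropy $ H(Y\vert X)$ is non-negative; equality is attained exactly when every cluster is pure with respect to the categorization.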

Hence, one can alternatively define the [0,1]-normalized asymmetric mutual-information-based quality as

$\displaystyle NI(X,Y) = \frac{I(X,Y)}{H(Y)}$ (8.11)

which translates into frequency counts as

$\displaystyle \phi^{(\mathrm{NAMI})} (\mathbf{\lambda},\mathbf{\kappa}) = \frac{ \sum_{\ell=1}^{k} \sum_{h=1}^{g} n_{\ell}^{(h)} \log \frac{ n_{\ell}^{(h)} n }{ n^{(h)} n_{\ell} } }{ - \sum_{h=1}^{g} n^{(h)} \log \frac{n^{(h)}}{n} } .$ (8.12)
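
For concreteness, the following minimal sketch evaluates (8.12) from a cluster-category contingency table; the function name nami, the NumPy representation, and the assumption that every category occurs at least once are illustrative choices rather than part of the original formulation.

import numpy as np

def nami(contingency):
    # contingency[l, h] holds n_l^(h), the number of objects that lie in
    # cluster l and carry category label h (assumed representation).
    nlh = np.asarray(contingency, dtype=float)
    n = nlh.sum()               # total number of objects n
    n_l = nlh.sum(axis=1)       # cluster sizes n_l
    n_h = nlh.sum(axis=0)       # category sizes n^(h), assumed all nonzero

    # Numerator of (8.12): sum over l, h of n_l^(h) log( n_l^(h) n / ( n^(h) n_l ) ),
    # i.e. n times the mutual information I(X,Y); empty cells contribute nothing.
    expected = np.outer(n_l, n_h)          # n_l * n^(h) for every (l, h) pair
    mask = nlh > 0
    num = (nlh[mask] * np.log(nlh[mask] * n / expected[mask])).sum()

    # Denominator of (8.12): - sum over h of n^(h) log( n^(h) / n ),
    # i.e. n times the categorization entropy H(Y).
    den = -(n_h * np.log(n_h / n)).sum()
    return num / den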

Note that this normalization of the mutual information is asymmetric and does not penalize over-refined clusterings $ \mathbf{\lambda}$. Consequently, $ \phi^{(\mathrm{NAMI})}$ is biased towards high $ k$, as the small example below illustrates.
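
A small illustration of this bias, reusing the hypothetical nami sketch above with made-up labels: assigning every object to its own singleton cluster already attains the maximal score of 1, because the numerator of (8.12) then collapses to the denominator.

# Hypothetical category labels for six objects drawn from three categories.
y = np.array([0, 0, 0, 1, 1, 2])

# Maximally over-refined clustering: one cluster per object (k = n).
singleton = np.zeros((len(y), 3))
singleton[np.arange(len(y)), y] = 1

print(nami(singleton))    # 1.0 (up to floating point), despite k = n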


