
Greedy Maximization for Cluster Ensembles

The Average Normalized Mutual Information (ANMI) objective defines the goal of the cluster ensemble problem proposed in chapter 5. We desire a more thorough theoretical analysis of ANMI to better understand its properties and to develop better optimization schemes. Investigating the ANMI objective might also yield a better understanding of the biases of the three proposed consensus functions.
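To make the objective concrete, the following is a minimal sketch of how ANMI can be computed from hard labelings, assuming the common convention of normalizing mutual information by the geometric mean of the two entropies (chapter 5 gives the exact definition; the function names here are illustrative, not from the thesis):

```python
import math
from collections import Counter

def nmi(a, b):
    """Normalized mutual information between two labelings a and b,
    normalized by sqrt(H(a) * H(b)) (assumed convention)."""
    n = len(a)
    ca, cb = Counter(a), Counter(b)
    cab = Counter(zip(a, b))
    # Mutual information I(a, b) from the joint and marginal counts.
    i_ab = sum((m / n) * math.log(m * n / (ca[x] * cb[y]))
               for (x, y), m in cab.items())
    # Entropy of a labeling from its label counts.
    h = lambda c: -sum((m / n) * math.log(m / n) for m in c.values())
    denom = math.sqrt(h(ca) * h(cb))
    return i_ab / denom if denom > 0 else 0.0

def anmi(labelings, consensus):
    """Average NMI of a candidate consensus labeling against the ensemble."""
    return sum(nmi(consensus, lab) for lab in labelings) / len(labelings)
```

A consensus labeling that agrees with every ensemble member (up to a relabeling of cluster indices) attains the maximum ANMI of 1.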

Based on this, we plan to explore greedy optimization schemes. As an enhancement to our proposed consensus functions in applications where $n$ is not too large, a greedy post-processing step can refine the best labeling found by the three algorithms from section 5.2 as follows: for each object, we exchange the current label with each of the $k-1$ other possible labels and evaluate the ANMI objective. If the ANMI increases, we change the object's label to the best new value and repeat the trials. We thereby greedily optimize the objective through single label changes. When no label change improves the objective for any object, a local optimum has been reached and this is the final consensus labeling. Like all local optimization procedures, greedy search depends strongly on its initialization. Starting from a random labeling is often computationally intractable and tends to result in poor local optima; however, initializing with the best of the three consensus functions might make the greedy search work well. Preliminary experiments indicate that this post-processing affects between 0% and 5% of the labels and yields slightly improved results. Adaptive step sizes (e.g., modifying larger label segments or blocks) and less local search techniques (e.g., maintaining a population of current consensus clusterings instead of just one) might be promising directions for greedy search.
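The single-label hill climbing described above can be sketched as follows. This is an illustrative implementation, not the thesis code: the `anmi` scoring function is a self-contained stand-in (assuming NMI normalized by the geometric mean of the entropies), and labelings are plain lists of integer cluster indices in $\{0, \dots, k-1\}$:

```python
import math
from collections import Counter

def anmi(labelings, cons):
    """Average NMI of `cons` against the ensemble (sketch; see chapter 5)."""
    def nmi(a, b):
        n = len(a)
        ca, cb, cab = Counter(a), Counter(b), Counter(zip(a, b))
        i_ab = sum((m / n) * math.log(m * n / (ca[x] * cb[y]))
                   for (x, y), m in cab.items())
        h = lambda c: -sum((m / n) * math.log(m / n) for m in c.values())
        d = math.sqrt(h(ca) * h(cb))
        return i_ab / d if d > 0 else 0.0
    return sum(nmi(cons, lab) for lab in labelings) / len(labelings)

def greedy_refine(labelings, consensus, k):
    """Greedy post-processing: for each object, try each of the k-1
    alternative labels and keep the change if it improves ANMI.
    Repeat until no single label change helps (a local optimum)."""
    consensus = list(consensus)
    best = anmi(labelings, consensus)
    improved = True
    while improved:
        improved = False
        for i in range(len(consensus)):
            current = consensus[i]
            for lab in range(k):
                if lab == current:
                    continue
                consensus[i] = lab  # trial move
                score = anmi(labelings, consensus)
                if score > best:
                    best, current = score, lab
                    improved = True
            consensus[i] = current  # keep the best label found for object i
    return consensus, best
```

Each sweep costs $O(n \, k)$ ANMI evaluations, which is why the step is only practical when $n$ is not too large and why a good initialization (the best of the three consensus functions) matters.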

Related work on integer programming techniques, Independent Component Analysis (ICA), and Maximum Mutual Information (MMI) techniques might also yield insights towards direct optimization of the objective.

We also explored extending the ANMI objective so that it can be computed from sets of soft clusterings, which yields a smooth (numerically differentiable) input space. However, preliminary experiments using gradient ascent on the continuous cluster-association input space failed, probably because the objective is not well behaved and the input space for optimization has a very high dimensionality.


Alexander Strehl 2002-05-03