How Many Clusters? An Entropic Approach to Hierarchical Cluster Analysis

Published in Intelligent Computing. SAI 2020. Advances in Intelligent Systems and Computing, vol 1230, 2020

Recommended citation: Koltcov S., Ignatenko V., Pashakhin S. (2020) How Many Clusters? An Entropic Approach to Hierarchical Cluster Analysis. In: Arai K., Kapoor S., Bhatia R. (eds) Intelligent Computing. SAI 2020. Advances in Intelligent Systems and Computing, vol 1230. Springer, Cham. https://doi.org/10.1007/978-3-030-52243-8_40

Clustering large, heterogeneous collections of user profiles from social media is challenging because determining the optimal number of clusters is considerably harder than it is for smaller, homogeneous data. We propose a new approach based on the deformed Rényi entropy for determining the optimal number of clusters in hierarchical clustering of user-profile data. Our results show that this approach allows us to estimate the Rényi entropy at each level of a hierarchical model and to locate the entropy minimum (information maximum). They also show that the solutions with the smallest and the largest numbers of clusters correspond to entropy maxima (information minima).
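A minimal sketch of the level-by-level scan this approach implies, assuming standard agglomerative clustering (SciPy's `linkage`/`fcluster`) and, as a placeholder, the ordinary Rényi entropy of order `q` computed on the cluster-size distribution. The paper's deformed Rényi entropy functional is not reproduced here; the sketch only illustrates the mechanics of cutting the dendrogram at every candidate number of clusters and comparing entropies across levels.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster


def renyi_entropy(p, q):
    """Rényi entropy of order q (q != 1) for a probability vector p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return np.log(np.sum(p ** q)) / (1.0 - q)


def entropy_by_level(X, max_clusters=20, q=2.0, method="ward"):
    """Rényi entropy of the cluster-size distribution at each cut of the dendrogram.

    Placeholder functional: the paper's deformed Rényi entropy is not reproduced here.
    """
    Z = linkage(X, method=method)
    curve = {}
    for k in range(2, max_clusters + 1):
        labels = fcluster(Z, t=k, criterion="maxclust")
        _, sizes = np.unique(labels, return_counts=True)
        curve[k] = renyi_entropy(sizes / sizes.sum(), q)
    return curve


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: three Gaussian blobs standing in for user-profile feature vectors.
    X = np.vstack([rng.normal(loc=c, scale=0.5, size=(200, 10)) for c in (0.0, 3.0, 6.0)])
    curve = entropy_by_level(X, max_clusters=15, q=2.0)
    print({k: round(s, 3) for k, s in curve.items()})
    # With the paper's deformed entropy, one would pick the level at the entropy minimum.
    print("entropy-minimizing number of clusters (for this placeholder):", min(curve, key=curve.get))
```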

