Poster (Open Access)
Poster presentation given at the ISSI 2019 conference, together with the related paper.
In these, we describe our effort, motivated by the 2019 research assessment of the University of Helsinki, to understand and test results from the tool RunNetworkClustering by CWTS, Leiden, on a citation network of about 67,000 publications. The network was compiled, and the partitions produced by the tool were studied, using the statistical software R. We found that clusters based on partitions from this clustering method vary, and that normalized indicators fluctuate, despite similarly high quality-function values; no clear criterion was found for choosing a robust clustering. Moreover, the Leiden algorithm, published in 2018 and included in the tool, uses random numbers to explore the space of network partitions, so the algorithm finds varying partitions even with identical parameter values.

Owing to limited space on the poster, we did not report all the tests that were made, e.g., raising the number of iterations up to 2000 and using only the so-called giant component of the network. The latter is problematic in the context of research assessment, because all publications should somehow be taken into account; in any case, these tests did not alter our conclusions and speculations. In conclusion, citation network clustering is potentially an important step towards more fundamental partitions of scientific research into fields and sub-fields, but further work is needed to find criteria for evaluating the results from the point of view of cluster contents.
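The R scripts used in the study are not reproduced here. As a minimal, hypothetical illustration of the kind of stability check described above (not the method used in the study), one can compare the partitions produced by two runs of a stochastic clustering algorithm with a pair-counting agreement measure such as the Rand index: two runs with similar quality-function values may still assign many node pairs differently.

```python
from itertools import combinations

def rand_index(labels_a, labels_b):
    """Pair-counting Rand index between two partitions of the same nodes.

    Each argument is a list of cluster labels, one per node. The index is
    the fraction of node pairs on which the two partitions agree (both put
    the pair in the same cluster, or both put it in different clusters).
    A value of 1.0 means identical partitions up to relabeling.
    """
    assert len(labels_a) == len(labels_b), "partitions must cover the same nodes"
    pairs = list(combinations(range(len(labels_a)), 2))
    agree = 0
    for i, j in pairs:
        same_a = labels_a[i] == labels_a[j]
        same_b = labels_b[i] == labels_b[j]
        if same_a == same_b:
            agree += 1
    return agree / len(pairs)

# Hypothetical labels from two runs with identical parameters but
# different random seeds: the partitions differ for some nodes.
run_1 = [0, 0, 0, 1, 1, 1]
run_2 = [0, 0, 1, 1, 1, 0]
print(rand_index(run_1, run_1))  # identical runs agree fully: 1.0
print(rand_index(run_1, run_2))  # varying runs agree only partially
```

A low pairwise agreement across repeated runs signals exactly the instability reported above; conversely, a high agreement would be one possible (though not the only) criterion for calling a clustering robust.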