Journal article Open Access

Mutual Information as a Measure of Intercoder Agreement

Klemens, Ben

In a situation where two raters are classifying a series of observations, it is useful to have an index of agreement among the raters that takes into account both the simple rate of agreement and the complexity of the rating task. Information theory provides a measure of the quantity of information in a list of classifications that can be used to produce an appropriate index of agreement. A normalized weighted mutual information index improves on the traditional intercoder agreement index in several ways, chief among them: no model of error generation needs to be developed before use; comparison across experiments is easier; and ratings are based on the distribution of agreement across categories, not just an overall agreement level.
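As a rough illustration of the idea, the sketch below computes the mutual information between two raters' label lists and normalizes it by the larger marginal entropy. This is one common normalization convention; the paper's weighted variant is not reproduced here, and the function name is ours.

```python
import math
from collections import Counter

def normalized_mutual_information(ratings_a, ratings_b):
    """Sketch: mutual information between two raters' classifications,
    normalized by the larger marginal entropy (one common convention;
    the article's weighted index may differ in its details)."""
    n = len(ratings_a)
    joint = Counter(zip(ratings_a, ratings_b))   # joint counts n(a, b)
    marg_a = Counter(ratings_a)                  # marginal counts for rater A
    marg_b = Counter(ratings_b)                  # marginal counts for rater B

    # I(A; B) = sum_{a,b} p(a,b) * log2( p(a,b) / (p(a) p(b)) )
    mi = 0.0
    for (a, b), count in joint.items():
        p_ab = count / n
        mi += p_ab * math.log2(p_ab * n * n / (marg_a[a] * marg_b[b]))

    # Marginal entropies H(A), H(B)
    h_a = -sum((c / n) * math.log2(c / n) for c in marg_a.values())
    h_b = -sum((c / n) * math.log2(c / n) for c in marg_b.values())

    denom = max(h_a, h_b)
    return mi / denom if denom > 0 else 1.0
```

Perfect agreement on a non-trivial task yields an index of 1, while statistically independent ratings yield 0, so the index reflects both agreement and the difficulty of the classification task.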

Files (166.8 kB)
                  All versions   This version
Views                       82             82
Downloads                   24             24
Data volume             4.0 MB         4.0 MB
Unique views                73             73
Unique downloads            19             19
