# What is Cohen's Kappa Coefficient?
Cohen's Kappa Coefficient measures the level of agreement between two raters who classify items into categories, adjusting for the agreement expected by chance. It is more robust than simple percent agreement because two raters assigning labels at random will still agree some of the time.
## Formula
$$\kappa = \frac{p_{o} - p_{e}}{1 - p_{e}}$$
Where:
- p_o is the relative observed agreement among raters (the proportion of items on which they assign the same label)
- p_e is the hypothetical probability of chance agreement, computed from each rater's marginal label frequencies
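As a concrete illustration, here is a minimal sketch in Python that computes p_o, p_e, and kappa from two lists of categorical labels. The function name, data, and variable names are illustrative, not part of any particular library.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Compute Cohen's kappa for two equal-length lists of categorical labels."""
    assert len(rater_a) == len(rater_b), "raters must score the same items"
    n = len(rater_a)

    # p_o: proportion of items on which the two raters agree
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # p_e: chance agreement from each rater's marginal label frequencies
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum((freq_a[label] / n) * (freq_b[label] / n)
              for label in freq_a.keys() & freq_b.keys())

    return (p_o - p_e) / (1 - p_e)

# Illustrative data: two raters labelling ten items as "yes"/"no"
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "no", "yes", "yes"]
b = ["yes", "no",  "no", "yes", "no", "yes", "yes", "no", "yes", "yes"]
print(round(cohens_kappa(a, b), 3))  # p_o = 0.8, p_e = 0.52, kappa ~= 0.583
```

If scikit-learn is installed, the result can be cross-checked with `sklearn.metrics.cohen_kappa_score(a, b)`.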
## Interpretation
A widely used descriptive scale, attributed to Landis and Koch (1977):
| Kappa Value | Interpretation |
|---|---|
| < 0 | Poor agreement (less than chance) |
| 0.00 - 0.20 | Slight agreement |
| 0.21 - 0.40 | Fair agreement |
| 0.41 - 0.60 | Moderate agreement |
| 0.61 - 0.80 | Substantial agreement |
| 0.81 - 1.00 | Almost perfect agreement |
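This scale is straightforward to encode as a lookup. A short sketch with thresholds taken directly from the table above (the function name is illustrative):

```python
def interpret_kappa(kappa):
    """Map a kappa value to its descriptive label from the table above."""
    if kappa < 0:
        return "Poor agreement (less than chance)"
    for upper, label in [(0.20, "Slight agreement"),
                         (0.40, "Fair agreement"),
                         (0.60, "Moderate agreement"),
                         (0.80, "Substantial agreement"),
                         (1.00, "Almost perfect agreement")]:
        if kappa <= upper:
            return label
    raise ValueError("kappa must lie in [-1, 1]")

print(interpret_kappa(0.69))  # Substantial agreement
```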
## Calculation Example
Suppose two raters yield the following values:
- Observed Agreement (p_o): 0.85
- Chance Agreement (p_e): 0.52
$$\kappa = \frac{0.85 - 0.52}{1 - 0.52}$$
$$\kappa = \frac{0.33}{0.48}$$
$$\kappa = 0.6875$$
Cohen's Kappa Coefficient is 0.6875, or approximately 0.69, indicating substantial agreement between the raters.
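The arithmetic is easy to verify directly. A quick check in Python using the values from the example:

```python
p_o, p_e = 0.85, 0.52  # observed and chance agreement from the example
kappa = (p_o - p_e) / (1 - p_e)
print(round(kappa, 4))  # 0.6875 -> "Substantial agreement" on the scale above
```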