Exact one-sided confidence limits for Cohen's kappa as a measurement of agreement

Document Type

Article

Publication Date

1-1-2017

Publication Title

Statistical Methods in Medical Research

Volume

26

Issue

2

First Page Number

615

Last Page Number

632

Abstract

Cohen's kappa coefficient, κ, is a statistical measure of inter-rater (inter-annotator) agreement for qualitative items. In this paper, we focus on interval estimation of κ in the case of two raters and binary items. Because of the complexity of κ, only asymptotic and bootstrap intervals have so far been available for it. However, there is no guarantee that such intervals capture κ at the desired nominal level 1 − α; in other words, statistical inferences based on these intervals are not reliable. We apply the Buehler method to obtain exact confidence intervals based on four widely used asymptotic intervals: three Wald-type confidence intervals and one interval constructed from a profile variance. These exact intervals are compared with regard to coverage probability and length for small to medium sample sizes. The exact intervals based on the Garner interval and the Lee and Tu interval are generally recommended for use in practice because they perform well in both coverage probability and length. © The Author(s) 2014.
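
As a concrete companion to the abstract, the sketch below shows how κ is computed for two raters and binary items, together with a percentile-bootstrap interval of the asymptotic/bootstrap kind the abstract contrasts with exact intervals. This is not the paper's Buehler construction; the function names, the 0/1 coding of ratings, and the bootstrap settings are illustrative assumptions.

import numpy as np

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' binary ratings (arrays coded 0/1)."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    p_o = np.mean(r1 == r2)              # observed agreement
    p1, p2 = r1.mean(), r2.mean()        # each rater's marginal rate of "1"
    p_e = p1 * p2 + (1 - p1) * (1 - p2)  # agreement expected by chance
    # Assumes p_e < 1, i.e. at least one rater uses both categories.
    return (p_o - p_e) / (1 - p_e)

def bootstrap_ci(r1, r2, alpha=0.05, n_boot=2000, seed=0):
    """Percentile-bootstrap interval for kappa, resampling item pairs."""
    rng = np.random.default_rng(seed)
    r1, r2 = np.asarray(r1), np.asarray(r2)
    n = len(r1)
    stats = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)      # resample paired ratings with replacement
        stats.append(cohens_kappa(r1[idx], r2[idx]))
    return np.percentile(stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])

# Example with hypothetical data:
# r1 = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
# r2 = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]
# print(cohens_kappa(r1, r2), bootstrap_ci(r1, r2))

As the abstract notes, intervals of this kind are not guaranteed to attain the nominal coverage 1 − α for small to medium sample sizes, which is what motivates the exact Buehler-adjusted limits studied in the paper.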

Language

English
