Table 6 Agreement/disagreement between two instruments using three measures

From: Measuring agreement between healthcare survey instruments using mutual information

| Data set | Odds ratio | Cohen's kappa | Local mutual information |
| --- | --- | --- | --- |
| Clock exam and MMSE^a (Table 2 in Shulman et al. 1986) | 15.6 | 0.493 | Agreement (0.422 in I_agreement, −0.212 in I_disagreement) |
| BDI^b + CDRS_R^c and ICD-10^d (Figure 1 in Russell et al. 2012) | ∞** | 0.693 | Agreement (0.18 in I_agreement, −0.03 in I_disagreement) |
| CDRS_R and ICD-10 (Figure 1 in Russell et al. 2012) | 0.311* | −0.015* | Inconclusive (−0.018 in I_agreement, 0.024 in I_disagreement) |
| Score scheme comparison (Table 4 in Seago 2002) | 4 | 0.265 | Agreement (0.152 in I_agreement, −0.107 in I_disagreement) |
(Note: * not significant at the 95 % confidence level; ** infinite because one of the cells contains 0. ^a Mini-Mental State Examination; ^b Beck Depression Inventory; ^c Children's Depression Rating Scale-Revised; ^d International Classification of Diseases-10 clinical interview.)
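The three measures in the table can all be derived from a 2×2 contingency table cross-tabulating the two instruments. The sketch below is a minimal Python illustration, not the paper's implementation: the split of local (pointwise) mutual information into I_agreement (concordant, diagonal cells) and I_disagreement (discordant, off-diagonal cells) is an assumed reading of the paper's decomposition, the log base is assumed to be 2, and the example counts are hypothetical.

```python
import math

def agreement_measures(table):
    """Compute odds ratio, Cohen's kappa, and a local-mutual-information
    split for a 2x2 contingency table [[a, b], [c, d]]
    (rows: instrument 1 positive/negative, columns: instrument 2).

    The I_agreement / I_disagreement split is an assumption: pointwise
    mutual information terms p(x,y) * log2(p(x,y) / (p(x) p(y))),
    summed over concordant (diagonal) vs discordant (off-diagonal) cells.
    """
    (a, b), (c, d) = table
    n = a + b + c + d

    # Odds ratio; infinite when an off-diagonal cell is 0,
    # as flagged with ** in the table above.
    odds_ratio = math.inf if b == 0 or c == 0 else (a * d) / (b * c)

    # Cohen's kappa: observed agreement corrected for chance agreement.
    p_obs = (a + d) / n
    p_exp = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2
    kappa = (p_obs - p_exp) / (1 - p_exp)

    def lmi(cell, row_sum, col_sum):
        # Weighted pointwise mutual information of one cell (0 if empty).
        if cell == 0:
            return 0.0
        p_xy, p_x, p_y = cell / n, row_sum / n, col_sum / n
        return p_xy * math.log2(p_xy / (p_x * p_y))

    i_agreement = lmi(a, a + b, a + c) + lmi(d, c + d, b + d)
    i_disagreement = lmi(b, a + b, b + d) + lmi(c, c + d, a + c)
    return odds_ratio, kappa, i_agreement, i_disagreement

# Hypothetical example: two instruments that mostly agree.
# Expect a large odds ratio, positive kappa, positive I_agreement,
# and negative I_disagreement, matching the "Agreement" pattern above.
print(agreement_measures([[40, 5], [10, 45]]))
```

With these example counts the concordant cells contribute positively and the discordant cells negatively, which is the signature the table reads as "Agreement"; the "Inconclusive" row shows the signs reversed.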