Table 2 The interrater reliability of the reviews among the three reviewers

From: Mining of EHR for interface terminology concepts for annotating EHRs of COVID patients

| Version  | Procedure     | Percent agreement (1st review) | Fleiss' Kappa (1st review) | Percent agreement (2nd review) | Fleiss' Kappa (2nd review) |
|----------|---------------|--------------------------------|----------------------------|--------------------------------|----------------------------|
| CIT_V1.1 | Concatenation | 0.85                           | 0.46                       | 0.91                           | 0.67                       |
| CIT_V1.2 | Anchoring     | 0.90                           | 0.60                       | 0.94                           | 0.78                       |
| CIT_V2.1 | Concatenation | 0.87                           | 0.43                       | 0.92                           | 0.67                       |
| CIT_V2.2 | Anchoring     | 0.91                           | 0.54                       | 0.94                           | 0.72                       |
| CIT_V3.1 | Concatenation | 0.84                           | 0.37                       | 0.92                           | 0.69                       |
| CIT_V3.2 | Anchoring     | 0.92                           | 0.66                       | –                              | –                          |
| CIT_V4.1 | Concatenation | 0.83                           | 0.23                       | –                              | –                          |
| CIT_V4.2 | Anchoring     | 0.96                           | 0.58                       | –                              | –                          |
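The table reports Fleiss' Kappa alongside raw percent agreement, but the page does not show how the statistic is obtained. A minimal sketch of the standard Fleiss' Kappa calculation, assuming ratings are arranged as a subjects-by-categories count matrix (each row records how many of the three reviewers assigned that item to each category); the example data below is hypothetical, not the study's annotations:

```python
def fleiss_kappa(ratings):
    """Fleiss' Kappa for a subjects-by-categories count matrix.

    ratings[i][j] = number of raters who put subject i in category j;
    every row must sum to the same number of raters n.
    """
    n_subjects = len(ratings)
    n_raters = sum(ratings[0])
    total = n_subjects * n_raters

    # Per-subject observed agreement P_i, then the mean P-bar
    p_i = [
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in ratings
    ]
    p_bar = sum(p_i) / n_subjects

    # Chance agreement P_e from the marginal category proportions
    n_categories = len(ratings[0])
    p_j = [sum(row[j] for row in ratings) / total for j in range(n_categories)]
    p_e = sum(p * p for p in p_j)

    return (p_bar - p_e) / (1 - p_e)


# Hypothetical example: 3 raters, 6 items, 2 categories (e.g. accept/reject)
counts = [
    [3, 0],  # all three raters agree on category 0
    [2, 1],  # split decision
    [3, 0],
    [0, 3],  # all three agree on category 1
    [1, 2],
    [3, 0],
]
print(round(fleiss_kappa(counts), 2))  # ≈ 0.50
```

This illustrates why Kappa can sit well below percent agreement, as in the table: the statistic discounts the agreement expected by chance (`p_e`), so high raw agreement on an imbalanced category distribution still yields a moderate Kappa.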