
Re: Cohen's Kappa for multiple raters

Posted by Meyer, Gregory J on Nov 09, 2006; 5:00pm
URL: http://spssx-discussion.165.s1.nabble.com/Cohen-s-Kappa-for-multiple-raters-tp1072037p1072038.html

Paul, the coefficient is so low because there are almost no measurable
individual differences among your subjects. All of them receive a value
of 2 from five of the six raters, and only one subject receives a value
of 1, from just one rater. Kappa, or any coefficient of agreement (e.g.,
a correlation), would be impossible to compute if you looked only at
data from Raters 2 through 6, because there would be no variability at
all (i.e., all scores are a constant). Unless the rating scale truly
needs to be applied in such a homogeneous sample, the way to address
this is to include a larger and more diverse sample of subjects in the
analyses.
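
For what it's worth, the arithmetic is easy to check by hand. The sketch
below (plain Python with NumPy, not the SPSS macro Paul linked, so treat
it as an illustration only) computes Fleiss' kappa for the table in your
message and reproduces the -.0345: observed agreement is about .933, but
chance agreement is about .936 because nearly every rating is a 2, so
the numerator goes slightly negative.

# Illustration only: Fleiss' kappa computed directly from Paul's 5 x 6 table.
import numpy as np

# Rows = subjects, columns = raters; category codes are 1 and 2.
ratings = np.array([
    [2, 2, 2, 2, 2, 2],
    [2, 2, 2, 2, 2, 2],
    [2, 2, 2, 2, 2, 2],
    [1, 2, 2, 2, 2, 2],
    [2, 2, 2, 2, 2, 2],
])
n_subjects, n_raters = ratings.shape
categories = np.unique(ratings)

# counts[i, j] = number of raters assigning subject i to category j
counts = np.array([[np.sum(row == c) for c in categories] for row in ratings])

# Per-subject observed agreement, then its mean
P_i = (np.sum(counts ** 2, axis=1) - n_raters) / (n_raters * (n_raters - 1))
P_bar = P_i.mean()                      # about .933

# Chance agreement from the overall category proportions
p_j = counts.sum(axis=0) / (n_subjects * n_raters)
P_e = np.sum(p_j ** 2)                  # about .936 -- nearly everything is a 2

kappa = (P_bar - P_e) / (1 - P_e)
print(round(kappa, 4))                  # -0.0345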

Greg

| -----Original Message-----
| From: SPSSX(r) Discussion [mailto:[hidden email]]
| On Behalf Of Paul Mcgeoghan
| Sent: Thursday, November 09, 2006 11:33 AM
| To: [hidden email]
| Subject: Cohen's Kappa for multiple raters
|
| Hi,
|
| I am using the syntax below from Raynald's SPSS Tools website:
| http://www.spsstools.net/Syntax/Matrix/CohensKappa.txt
|
| In my case I have 6 raters rating 5 subjects and there are 2
| categories so the data is as below:
|
| Subj   rater1   rater2   rater3   rater4   rater5   rater6
| 1         2        2        2        2        2        2
| 2         2        2        2        2        2        2
| 3         2        2        2        2        2        2
| 4         1        2        2        2        2        2
| 5         2        2        2        2        2        2
|
| This gives a value of -.0345, which indicates no agreement
| according to the following article:
| http://en.wikipedia.org/wiki/Fleiss%27_kappa
|
| Most of the raters agree in the table above, so why is Cohen's
| Kappa negative, indicating no agreement?
|
| Also, the output gives Cohen's Kappa as -.0345 and the Fleiss-adjusted
| standard error for Cohen's Kappa as .1155.
|
| Which value do I use for Cohen's Kappa among multiple raters:
| is it -.0345 or .1155?
|
| Paul
|
|
| ==================
| Paul McGeoghan,
| Application support specialist (Statistics and Databases),
| University Infrastructure Group (UIG),
| Information Services,
| Cardiff University.
| Tel. 02920 (875035).
|