
Re: Cohen's Kappa for multiple raters

Posted by bdates on Nov 09, 2006; 5:00pm
URL: http://spssx-discussion.165.s1.nabble.com/Cohen-s-Kappa-for-multiple-raters-tp1072037p1072039.html

Paul,

The negative kappa indicates that the degree of agreement is less than
would be expected by chance.  What you've run into is the paradox, first
described by Feinstein and Cicchetti, that occurs with kappa and most
kappa-like statistics.  As marginal homogeneity decreases (trait prevalence
becomes more skewed), the value of kappa decreases even though raw rater
agreement may be very high.  Scott's pi, Cohen's kappa, and Conger's kappa
were all developed on the assumption of marginal homogeneity; the greater
the deviation from it, the more kappa is diminished.  This is one of the
standard criticisms of kappa-type statistics.  Regarding your second
question, use the kappa value of -0.0345.  The 0.1155 is the asymptotic
standard error, for use in computing confidence intervals.
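
As a quick check, here is a minimal sketch (in Python rather than the SPSS
macro, with illustrative variable names) that reproduces the -0.0345 from
your 5-subject, 6-rater table and builds a 95% confidence interval from the
reported standard error.  It also shows the paradox directly: chance
agreement P_e (about 0.936) slightly exceeds observed agreement P-bar
(about 0.933), so kappa comes out negative even though raw agreement is
nearly perfect.

# Fleiss' kappa for the quoted 5-subject x 6-rater table (plain Python).
# ratings[subject] holds the six raters' category codes (1 or 2).
ratings = [
    [2, 2, 2, 2, 2, 2],
    [2, 2, 2, 2, 2, 2],
    [2, 2, 2, 2, 2, 2],
    [1, 2, 2, 2, 2, 2],   # the single disagreement
    [2, 2, 2, 2, 2, 2],
]
categories = [1, 2]
N = len(ratings)         # subjects = 5
n = len(ratings[0])      # raters   = 6

# n_ij: number of raters assigning subject i to category j
counts = [[row.count(c) for c in categories] for row in ratings]

# Per-subject agreement P_i and mean observed agreement P-bar
P_i = [(sum(x * x for x in row) - n) / (n * (n - 1)) for row in counts]
P_bar = sum(P_i) / N                            # 14/15, about 0.9333

# Marginal category proportions p_j and chance agreement P_e
p_j = [sum(row[j] for row in counts) / (N * n) for j in range(len(categories))]
P_e = sum(p * p for p in p_j)                   # about 0.9356

kappa = (P_bar - P_e) / (1 - P_e)               # about -0.0345
print("kappa = %.4f" % kappa)

# 95% CI from the asymptotic standard error reported in the output (0.1155)
ase = 0.1155
print("95%% CI: (%.3f, %.3f)" % (kappa - 1.96 * ase, kappa + 1.96 * ase))

The resulting interval runs from about -0.26 to 0.19 and spans zero, which
is consistent with chance-level agreement once the skewed margins are taken
into account.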

Brian

Brian G. Dates, Director of Quality Assurance
Southwest Counseling and Development Services
1700 Waterman
Detroit, Michigan  48209
Telephone: 313.841.7442
FAX:  313.841.4470
email: [hidden email]


> -----Original Message-----
> From: Paul Mcgeoghan [SMTP:[hidden email]]
> Sent: Thursday, November 09, 2006 11:33 AM
> To:   [hidden email]
> Subject:      Cohen's Kappa for multiple raters
>
> Hi,
>
> I am using the syntax below from Raynald's SPSS Tools website:
> http://www.spsstools.net/Syntax/Matrix/CohensKappa.txt
>
> In my case I have 6 raters rating 5 subjects and there are 2 categories, so
> the data are as below:
>
> Subj   rater1   rater2   rater3   rater4   rater5   rater6
> 1        2        2        2        2        2        2
> 2        2        2        2        2        2        2
> 3        2        2        2        2        2        2
> 4        1        2        2        2        2        2
> 5        2        2        2        2        2        2
>
> This gives a value of -.0345, which indicates no agreement according to the
> following article:
> http://en.wikipedia.org/wiki/Fleiss%27_kappa
>
> Most of the raters agree in the above table, so why is Cohen's Kappa
> negative, indicating no agreement?
>
> Also, in the output it gives Cohen's Kappa as -.0345 and the Cohen's Kappa
> Fleiss-adjusted standard error as .1155.
>
> Which value do I use for Cohen's Kappa among multiple raters: -.0345 or
> .1155?
>
> Paul
>
>
> ==================
> Paul McGeoghan,
> Application support specialist (Statistics and Databases),
> University Infrastructure Group (UIG),
> Information Services,
> Cardiff University.
> Tel. 02920 (875035).
>
>