
Re: Fleiss Kappa

Posted by Martin Holt-3 on Jan 25, 2015; 12:48pm
URL: http://spssx-discussion.165.s1.nabble.com/Fleiss-Kappa-tp5728467p5728474.html

A very useful text for this is 

Streiner, D.L., & Norman, G.R. (1995). Health Measurement Scales: A Practical Guide to Their Development and Use, 2nd ed. Oxford Medical Publications.

You can obtain a cheap copy if you visit AbeBooks.

You might find that the Google Group "MedStats" serves you well with regard to this discussion.

Kind Regards,

Martin


 
Martin P. Holt

Freelance Medical Statistician and Quality Expert

[hidden email]

Persistence and Determination Alone are Omnipotent!

If you can't explain it simply, you don't understand it well enough. (Einstein)





From: Rich Ulrich <[hidden email]>
To: [hidden email]
Sent: Saturday, 24 January 2015, 3:00
Subject: Re: Fleiss Kappa

Here are my reactions to the task --

It seems to me that you get everything you want about an item
when you look at the mean of the 3 or 4 raters: "0.0" says that they
agreed on absence, and "1.0" says that they agreed on presence --
which do not mean the same thing.  Once you have listed the items in
decreasing order of that mean, what is there to add, beyond counting
how many are 0 and how many are 1?
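
As a rough sketch of that item-level summary (the 0/1 ratings below are invented purely for illustration; Python/NumPy assumed):

    import numpy as np

    # Hypothetical 0/1 ratings: rows = items (characteristics), columns = the 3-4 raters.
    ratings = np.array([
        [1, 1, 1, 1],   # all raters saw the characteristic  -> mean 1.0
        [0, 0, 0, 0],   # all raters agreed it was absent    -> mean 0.0
        [1, 0, 1, 1],   # partial agreement                  -> mean 0.75
    ])

    item_means = ratings.mean(axis=1)
    order = np.argsort(item_means)[::-1]        # items in decreasing order of the mean

    print("means, high to low:", item_means[order])
    print("agreed present (mean == 1.0):", int(np.sum(item_means == 1.0)))
    print("agreed absent  (mean == 0.0):", int(np.sum(item_means == 0.0)))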

It seems unwarranted, and not useful, to compute Fleiss's Kappa between
pairs of raters.  "Between a pair" is how the original (Cohen's) Kappa is
used, and how I prefer to use it.  I don't gain much useful insight from
knowing merely that they "agree," without knowing the direction of the
agreement.
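
If pairwise numbers are still wanted for a given characteristic, one minimal sketch (not from the original posts) is to run the ordinary two-rater kappa for every pair of raters, for example with scikit-learn's cohen_kappa_score; the presence/absence calls below are invented for illustration:

    from itertools import combinations
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical presence/absence calls for one characteristic across eight transcripts.
    rater_calls = {
        "A": [1, 0, 1, 1, 0, 1, 0, 0],
        "B": [1, 0, 1, 0, 0, 1, 0, 1],
        "C": [1, 1, 1, 1, 0, 1, 0, 0],
    }

    for r1, r2 in combinations(rater_calls, 2):
        k = cohen_kappa_score(rater_calls[r1], rater_calls[r2])
        print(f"kappa({r1}, {r2}) = {k:.2f}")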

--
Rich Ulrich



> Date: Fri, 23 Jan 2015 19:13:31 -0700

> From: [hidden email]
> Subject: Re: Fleiss Kappa
> To: [hidden email]
>
> Here is the text of the original...
> "I'd like feedback and suggestions on my intended use of Fleiss' Kappa to
> assess interrater agreement in a job analysis study that we are doing for a
> new job that has no existing incumbents.
>
> In this job analysis we are collecting interview data from about 40 subject
> matter experts. Our interviews are essentially detailed discussions of
> working conditions/environment, work tasks, and requisite KSAOs (knowledge,
> skills, abilities, and other characteristics) that are important for
> employees in this particular job. We have a list of about 50 personal
> characteristics (e.g., persistence, trustworthiness, creative thinking
> ability) that our literature review suggests are likely to be related to the
> particular job we are analyzing. We intend to have 3 or 4 raters read all
> interviews and rate each on the presence or absence of all 50 job-related
> factors. Since our ratings are categorical (yes/no), it appears that
> Fleiss' Kappa is the proper interrater agreement statistic, but all of the
> illustrations that I have seen of the use of this statistic are for a
> single assignment-to-category decision, not for multiple such assignments.
> This suggests that we will have to calculate a Fleiss Kappa for each of our
> 50 personal characteristics and then combine them (a simple mean?) to obtain
> an indication of overall interrater agreement.
>
> My questions are: (1) Is this approach of calculating 50 separate Fleiss
> Kappas and then averaging them the best approach? (2) Is there a way
> (existing SPSS tool or Excel spreadsheet) that allows all calculations to be
> done in one effort, or do we have to repeat the calculation 50 times? (3)
> Just to help settle my theoretical ruminations: If two raters do not see a
> given personal characteristic in a transcript, is this agreement as
> meaningful as when two raters do see a given personal characteristic?
> Intuitively, it seems that a positive affirmation of the presence of a
> personal characteristic is more meaningful to the aims of the study because
> absence of mention doesn't necessarily mean that the characteristic is not
> important.
>
> Thanks in advance for your thoughts."
> ------
> My suggestion was a search of this group, because this has been discussed
> in some detail many times in the past; see Brian Dates' posts in particular.
> It occurs to me also that there is an EXTENSION command for this (IIRC).
>
>
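
For questions (1) and (2) in the message quoted above: outside SPSS, all 50 per-characteristic Fleiss kappas can be computed in one pass, for example with statsmodels' aggregate_raters and fleiss_kappa. The sketch below uses randomly generated 0/1 ratings purely for illustration; in the real study the array would hold the raters' actual presence/absence codes. (The SPSS EXTENSION command recalled above is, I believe, STATS FLEISS KAPPA.)

    import numpy as np
    from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

    # Hypothetical data: 40 interviews x 4 raters x 50 characteristics, coded 0/1.
    rng = np.random.default_rng(0)
    ratings = rng.integers(0, 2, size=(40, 4, 50))

    kappas = []
    for c in range(ratings.shape[2]):
        # aggregate_raters turns a subjects-x-raters matrix into the
        # subjects-x-categories count table that fleiss_kappa expects.
        table, _ = aggregate_raters(ratings[:, :, c])
        kappas.append(fleiss_kappa(table, method="fleiss"))

    print("first five kappas:", np.round(kappas[:5], 3))
    print("simple mean over the 50 characteristics:", round(float(np.mean(kappas)), 3))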

===================== To manage your subscription to SPSSX-L, send a message to [hidden email] (not to SPSSX-L), with no body text except the command. To leave the list, send the command SIGNOFF SPSSX-L For a list of commands to manage subscriptions, send the command INFO REFCARD

