Dear Brian,
Thanks a lot for the macro; I was able to run it and it works fine. However, the overall Fleiss kappa is not ideal for my project, because it depends heavily on how the ratings are distributed across the categories in the total cohort of patients, not only on the agreement between raters for individual patients. For example, imagine near-perfect agreement between raters and two possible situations: (A) 6 patients diagnosed with A and 6 diagnosed with B; (B) 2 patients diagnosed with A and 10 diagnosed with B. The overall Fleiss kappa will differ between these two situations even though the per-patient agreement is identical (with exactly 100% agreement kappa is 1 in both cases, but as soon as there is any disagreement, the skewed margins of situation B pull kappa down). It is also worth mentioning that there is no gold standard for the diagnosis in my cohort (the ultimate diagnosis is not known).
I think the only thing I can do with Fleiss kappa is to calculate the proportion of inter-rater agreement for each patient.
Do you think Krippendorff's alpha or Gwet's AC1 would work better for this project?
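To make the point concrete, here is a small Python sketch (an illustration only, not SPSS syntax or your macro; the three-rater, twelve-patient scenarios are hypothetical) that computes Fleiss' kappa and Gwet's AC1 for a balanced and a skewed cohort with identical per-patient agreement:

```python
def fleiss_kappa(counts):
    """Fleiss' kappa from an N-subjects x Q-categories table, where each
    cell holds the number of raters assigning that category to that subject."""
    N = len(counts)
    n = sum(counts[0])                          # raters per subject (constant)
    Q = len(counts[0])
    p = [sum(row[q] for row in counts) / (N * n) for q in range(Q)]
    P_i = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts]
    P_bar = sum(P_i) / N                        # mean observed agreement
    P_e = sum(x * x for x in p)                 # chance agreement from marginals
    return (P_bar - P_e) / (1 - P_e)

def gwet_ac1(counts):
    """Gwet's AC1: same observed agreement as Fleiss' kappa, but the chance
    term is sum_q pi_q * (1 - pi_q) / (Q - 1)."""
    N = len(counts)
    n = sum(counts[0])
    Q = len(counts[0])
    pi = [sum(row[q] for row in counts) / (N * n) for q in range(Q)]
    P_i = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts]
    P_bar = sum(P_i) / N
    P_e = sum(x * (1 - x) for x in pi) / (Q - 1)
    return (P_bar - P_e) / (1 - P_e)

# Three raters, 12 patients, exactly one split (2-1) patient in each cohort,
# so raw per-patient agreement is identical in both.
balanced = [[3, 0]] * 5 + [[2, 1]] + [[0, 3]] * 6   # ~6 A vs ~6 B
skewed   = [[3, 0]] * 1 + [[2, 1]] + [[0, 3]] * 10  # ~2 A vs ~10 B

print(fleiss_kappa(balanced), fleiss_kappa(skewed))  # kappa drops under skew
print(gwet_ac1(balanced), gwet_ac1(skewed))          # AC1 barely moves
```

With perfect agreement both indices equal 1 in either cohort; the divergence only appears once there is some disagreement, which is exactly the prevalence effect described above.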
With kind regards,
Mack
From: bdates [via SPSSX Discussion] [mailto:ml-node+[hidden email]]
Sent: 05 November 2014 14:04
To: Maciej Jurynczyk
Subject: Re: Inter-rater agreement for multiple raters or something else?
Mack,
The Excel sheet on Jason's website is mine, but there are problems with it. For some reason it keeps getting corrupted, so I'd be careful about the results; I'll send my macro offline. More to the point, Fleiss is one of the few authors who provided 'official' formulae for category kappas as well as an overall solution. I'd be interested to know whether he actually had formulae for individual cases/subjects; I've never seen that in the literature. As an idea, you could write syntax to set up a loop that, for each case, counts each of the four values assigned to your diagnoses, then compute a variable counting the number of diagnoses with more than one, or two, or ... occurrences (whatever value you set as a cutoff). That would give you an idea of the raw agreement and help distinguish 'difficult' patients from 'easy' patients.
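That loop-and-cutoff idea can be sketched in Python rather than SPSS syntax (the data layout, the cutoff value, and the function name are assumptions for illustration):

```python
from collections import Counter

def screen_cases(cases, cutoff=1):
    """For each case (one diagnosis per rater), count how often each
    diagnosis was assigned, then count how many diagnoses occur more than
    `cutoff` times; the modal share is the raw per-case agreement."""
    out = []
    for ratings in cases:
        counts = Counter(ratings)
        above = sum(1 for c in counts.values() if c > cutoff)
        modal_share = max(counts.values()) / len(ratings)
        out.append({"counts": dict(counts),
                    "above_cutoff": above,
                    "modal_share": modal_share})
    return out

# Four raters: an 'easy' patient, then two progressively harder ones.
cases = [["A", "A", "A", "A"],
         ["A", "B", "B", "C"],
         ["C", "C", "D", "D"]]
for row in screen_cases(cases):
    print(row)
```

Cases where `above_cutoff` is large or `modal_share` is low are the 'difficult' patients.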
Brian Dates, M.A.
Director of Evaluation and Research | Evaluation & Research | Southwest Counseling Solutions
Southwest Solutions
1700 Waterman, Detroit, MI 48209
313-841-8900 (x7442) office | 313-849-2702 fax
[hidden email] |
www.swsol.org
-----Original Message-----
From: SPSSX(r) Discussion [mailto:[hidden email]] On Behalf Of MJury
Sent: Tuesday, November 04, 2014 6:03 PM
To: [hidden email]
Subject: Re: Inter-rater agreement for multiple raters or something else?
Dear Art and David, thanks so much for your interest!
A, B, C and D are four possible diagnostic categories and raters were asked to choose only one of them (mutually exclusive and exhaustive categories).
Patients are arranged in rows and raters in columns. I calculated the overall Fleiss kappa for all patients and all raters, but I would also be interested in identifying the patients that are most debatable/controversial from a diagnostic point of view. I am not sure whether Fleiss kappa is the best solution here, as I understand its main goal is to assess the reliability of raters, while I am more interested in recognizing the clinical phenotypes that cause disagreement between raters. Maybe Fleiss's per-subject agreement P_i (the extent to which raters agree for the i-th subject) would be appropriate? However, the Fleiss kappa Excel spreadsheet I downloaded from Jason E. King's website does not calculate it. Brian, thanks a lot for your kindness; does your macro calculate P_i for Fleiss kappa?
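For what it's worth, that per-subject quantity from Fleiss (1971) is simple enough to compute directly; a minimal Python sketch (not the spreadsheet or the macro, and the example counts are hypothetical):

```python
def per_subject_agreement(row):
    """Fleiss (1971) P_i for one subject: row holds, per category, the
    number of raters who assigned that category; the result is the
    proportion of agreeing rater pairs, P_i = (sum_j n_ij^2 - n) / (n(n-1))."""
    n = sum(row)                                   # number of raters
    return (sum(c * c for c in row) - n) / (n * (n - 1))

# Six raters over categories A-D:
print(per_subject_agreement([6, 0, 0, 0]))  # unanimous subject
print(per_subject_agreement([3, 2, 1, 0]))  # split subject, low agreement
```

Ranking patients by P_i would surface the most controversial ones directly.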
I would appreciate any comments.
With best regards,
Mack
--
View this message in context:
http://spssx-discussion.1045642.n5.nabble.com/Inter-rater-agreement-for-multiple-raters-or-something-else-tp5727786p5727790.html
Sent from the SPSSX Discussion mailing list archive at Nabble.com.
=====================
To manage your subscription to SPSSX-L, send a message to
[hidden email] (not to SPSSX-L), with no body text except the command.
To leave the list, send the command SIGNOFF SPSSX-L.
For a list of commands to manage subscriptions, send the command INFO REFCARD.
=====================