Hello everyone,
I am trying to figure out how to calculate a kappa statistic, or perhaps some other inter-rater agreement statistic such as an intraclass correlation. I have 40 raters who rated 6 cases as passing or failing using a new instrument. I would like a statistic that gives me a measure of inter-rater agreement among these 40 doctors. In SPSS I only know how to get a kappa for 2 raters, using the CROSSTABS procedure. Any suggestions?

Many thanks in advance,
Janell Mensinger
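For reference, the two-rater kappa mentioned above comes from the CROSSTABS procedure. A minimal sketch, assuming the two rating columns are named rater1 and rater2 with one row per rated case (the variable names are placeholders, not from the original post):

* Cohen's kappa for two raters, one row per case.
CROSSTABS
  /TABLES=rater1 BY rater2
  /STATISTICS=KAPPA.
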
Janell,
Whether you use a kappa statistic or an ICC will depend on the nature of your data. If the ratings are categorical, then kappa is indicated; if they are on an interval scale, then you can use an ICC.

If you decide to use a kappa statistic, I have syntax on Jason King's website at Baylor College of Medicine that will run the analysis for any number of raters and categories. The syntax there provides solutions for Bennett's S, Fleiss' generalized kappa (often misreferred to as Cohen's kappa), the real Cohen's kappa, Conger's kappa, and Gwet's AC1. Andy Hayes at the Ohio State University has similar syntax on his website for Krippendorff's alpha, which uses a bootstrap approach. If you would prefer syntax for a single kappa-type statistic rather than the entire batch, please contact me off-list and I will send it to you.

Jason King's page at Baylor College of Medicine:
http://www.ccitonline.org/jking/homepage/

Andy Hayes at OSU:
http://www.comm.ohio-state.edu/ahayes/

Good luck,
Brian
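If the ratings had been on an interval scale, the ICC mentioned above is available directly in SPSS through the RELIABILITY procedure. A minimal sketch, assuming the 40 ratings sit in columns r01 to r40 with one row per case; the variable names, and the choice of a two-way random-effects model with absolute agreement, are illustrative assumptions rather than part of the advice above:

* Intraclass correlation across 40 raters, one row per case.
RELIABILITY
  /VARIABLES=r01 TO r40
  /SCALE('all raters') ALL
  /MODEL=ALPHA
  /ICC=MODEL(RANDOM) TYPE(ABSOLUTE) CIN=95.
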
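For the pass/fail data in the original question, Fleiss' generalized kappa can also be computed by hand with the SPSS MATRIX language. The following is a rough sketch, not the tested syntax from the sites above: it assumes the ratings are coded 0/1 in hypothetical columns r01 to r40 with no missing values, and it handles the two-category case only.

/* Fleiss' generalized kappa for dichotomous (0/1) ratings; */
/* r01 TO r40 hold the 40 raters' codes, one row per case.  */
MATRIX.
GET x /VARIABLES=r01 TO r40 /MISSING=OMIT.
COMPUTE k = NCOL(x).
COMPUTE n = NROW(x).
/* Number of raters coding 1 and 0 on each case. */
COMPUTE n1 = RSUM(x).
COMPUTE n0 = k - n1.
/* Observed agreement per case, then its mean over cases. */
COMPUTE pa = (n0&**2 + n1&**2 - k) / (k*(k - 1)).
COMPUTE pbar = CSUM(pa) / n.
/* Chance agreement from the pooled proportion of 1s. */
COMPUTE p1 = MSUM(x) / (n*k).
COMPUTE pe = p1**2 + (1 - p1)**2.
COMPUTE fkappa = (pbar - pe) / (1 - pe).
PRINT fkappa /TITLE='Fleiss generalized kappa (2 categories)'.
END MATRIX.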
