Hello All,
I have 2 raters (S & J) who rate schools on 14 attributes. Each attribute is rated on a dichotomous scale. I know that I can look at the kappa for each attribute (S1 and J1, S2 and J2, and so forth), but is there any way to account for the fact that I am doing this 14 times? Or a way to summarize overall agreement? I found the Kupper and Hafner (1989) paper describing what to do when more than one nominal variable is being rated, and I am hoping there is a similar SPSS solution. If there isn't an answer, what would you suggest for reporting the range of kappas across attributes?

Thanks, Susan
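A minimal sketch of the per-attribute kappas in SPSS syntax, assuming the rater-by-strategy variables are named S1/J1 through S14/J14 (the naming Susan describes in her follow-up below); CROSSTABS with /STATISTICS=KAPPA gives Cohen's kappa for each rater pair:

* Cohen's kappa per strategy: one 2x2 table per rater pair (variable names assumed).
CROSSTABS
  /TABLES=S1 BY J1
  /TABLES=S2 BY J2
  /STATISTICS=KAPPA.
* Add a /TABLES line for each remaining strategy (S3 BY J3 through S14 BY J14).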
Maybe you need to be more forthcoming about your data!
From what you describe, it sounds like you simply have 28 values (2 raters x 14 attributes)? I don't see how a kappa can be calculated for 2 raters on even a single attribute from that! Please share with the class!
Sorry, David, let me try again.
There are two raters (S & J).

There are 12 schools being rated (1-12).

Each school is rated dichotomously on 14 attributes (did/did not use Strategy 1, did/did not use Strategy 2, and so forth).

As you might expect (as would be typical for kappa), I have the schools set as the cases (rows), and the raters as the variables, organized by strategy (columns). So variable 1 (V1) is S1 (rater S, strategy 1), V2 is J1 (rater J, strategy 1), V3 is S2, V4 is J2, etc.

Each rater rates each school on each attribute, so I have no missing data.

If I only had one strategy, this would be simple. But I have 14, and while I can look at the IRR for each strategy separately, that doesn't take into account the multivariate nature of the data. I was hoping the forum would have some insight.

If my description above is still lacking, I can post a sample table; a small mock-up of the layout is sketched below.

Thanks for any help!
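For concreteness, here is a small hypothetical mock-up of that layout in syntax form (school IDs and 0/1 values are invented for illustration; only the first three strategy pairs are shown):

* Hypothetical wide layout: one row per school, one S/J column pair per strategy (0 = did not use, 1 = did use).
DATA LIST FREE / School S1 J1 S2 J2 S3 J3.
BEGIN DATA
1  1 1  0 0  1 0
2  0 0  1 1  1 1
3  1 0  1 1  0 0
END DATA.

The real file continues out to S14/J14 and down to school 12.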
Susan,
It seems to me as though stacking your data will simply provide an overall kappa. Think of your strategies as 14 different measures. If that were the case, you'd want to know, for each measure, the extent to which the raters agreed. So... first, for overall agreement, stack your data so that there are two columns, with rater S in one and rater J in the other, giving 168 lines (12 schools x 14 strategies). This will provide an overall kappa for both raters across all the data. Then run each strategy separately. I know that's a pain, but you'll be able to identify which strategies, if any, are giving the raters difficulty. Kappa in SPSS is not set up, either in the point-and-click interface or in any available macro, to provide information on both overall agreement and individual-measure (or in your case, strategy) agreement in one run.

Brian
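A sketch of Brian's stacking suggestion in syntax form, again assuming the hypothetical S1..S14/J1..J14 naming plus a School ID variable. Because the S and J columns alternate in Susan's file, the MAKE lists name each variable explicitly rather than relying on the positional TO keyword:

* Stack the 14 strategy pairs into two columns: 168 rows = 12 schools x 14 strategies.
VARSTOCASES
  /MAKE S FROM S1 S2 S3 S4 S5 S6 S7 S8 S9 S10 S11 S12 S13 S14
  /MAKE J FROM J1 J2 J3 J4 J5 J6 J7 J8 J9 J10 J11 J12 J13 J14
  /INDEX=Strategy(14)
  /KEEP=School.

* Overall kappa, pooling all strategies.
CROSSTABS /TABLES=S BY J /STATISTICS=KAPPA.

* Kappa for each strategy separately, all in one run.
SORT CASES BY Strategy.
SPLIT FILE LAYERED BY Strategy.
CROSSTABS /TABLES=S BY J /STATISTICS=KAPPA.
SPLIT FILE OFF.

One caveat on the pooled run: it treats all 168 ratings as exchangeable, so strategies with very different base rates are mixed together; the per-strategy kappas from the SPLIT FILE run are worth reporting alongside the overall figure.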