
Inter rater reliability - Fleiss's Kappa?

Posted by mcolmar on Sep 26, 2012; 5:11pm
URL: http://spssx-discussion.165.s1.nabble.com/Inter-rater-reliability-Fleiss-s-Kappa-tp5715293.html

Hi

I am a 4th-year medical student writing my first research paper, and I have very minimal (or zero) experience with statistics!

My project involves the creation of a behavior marking system which has 5 categories (e.g. leadership, communication, etc.). These categories are further broken down into subcategories (called 'elements') that define the behavior; the number of elements varies from 3 to 6 per category. Each individual being assessed is awarded a score of 1-5 on every element, with 5 representing excellent performance of the behavioral element. There are 3 separate raters.

I am hoping to assess the inter-rater reliability for each element. What is an appropriate measure of this? I have done a little research, which suggests that Fleiss's kappa would be best since there are 3 raters. Is this correct, and if so, can I use the SPSSX / PASW software to do this?
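In case it helps, here is my current understanding of the calculation, sketched in Python rather than SPSS syntax. It is just a minimal illustration for a single element; the function, the variable names, and the example numbers are my own made-up placeholders, not real data.

    import numpy as np

    def fleiss_kappa(counts):
        # counts[i, j] = number of raters who gave subject i the score j.
        # Every row must sum to the same number of raters n (here, 3).
        counts = np.asarray(counts, dtype=float)
        n_subjects = counts.shape[0]
        n_raters = counts[0].sum()

        # Proportion of all ratings that fall in each score category.
        p_j = counts.sum(axis=0) / (n_subjects * n_raters)

        # Per-subject agreement: fraction of rater pairs that agree.
        P_i = ((counts ** 2).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))

        P_bar = P_i.mean()        # mean observed agreement
        P_e = (p_j ** 2).sum()    # agreement expected by chance
        return (P_bar - P_e) / (1 - P_e)

    # Example: 4 rated individuals, one element, scores 1-5, 3 raters.
    # Row i counts how many of the 3 raters gave individual i each score 1..5.
    ratings = [
        [0, 0, 1, 2, 0],   # one rater gave a 3, two gave a 4
        [0, 0, 0, 1, 2],
        [1, 1, 1, 0, 0],   # complete disagreement
        [0, 0, 0, 3, 0],   # perfect agreement on a 4
    ]
    print(fleiss_kappa(ratings))   # about 0.14 for this made-up data

(If anyone works in Python, I gather the statsmodels package also provides statsmodels.stats.inter_rater.fleiss_kappa, which should give the same number from the same subjects-by-categories matrix.)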

Any help would be very much appreciated!

Thanks,
Maddy