
Interrater reliability

Posted by Lovins, Brian (lovinsbk) on Aug 03, 2011; 6:41pm
URL: http://spssx-discussion.165.s1.nabble.com/Interrater-reliability-tp4663573.html

Good afternoon,

I am looking to calculate Kappa as a measure of interrater reliability. I currently have 125 subjects, each rated by a different pair of staff: every pair assesses the same person, but there are 125 different pairs in total. I want to calculate an overall Kappa for the entire group. I can compute Kappa for each individual pair and average the scores, but I was hoping there is a syntax/macro that would calculate the overall Kappa directly. The data are formatted as follows, but I can restructure them if needed:

 

Rater1a              Rater1b              Rater2a              Rater2b
Item 1 - Subject 1   Item 1 - Subject 1   Item 1 - Subject 2   Item 1 - Subject 2
Item 2 - Subject 1   Item 2 - Subject 1   Item 2 - Subject 2   Item 2 - Subject 2
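For reference, the "pooled" alternative to averaging per-pair Kappas is to stack every pair's ratings into two long columns (rater A vs. rater B across all subjects and items) and compute a single Cohen's Kappa on the stacked data. A minimal sketch in Python, with purely hypothetical binary ratings (the rating values and number of pairs here are made up for illustration):

```python
from collections import Counter

def cohen_kappa(r1, r2):
    """Cohen's kappa for two equal-length lists of categorical ratings."""
    assert len(r1) == len(r2) and len(r1) > 0
    n = len(r1)
    # Observed proportion of agreement.
    po = sum(a == b for a, b in zip(r1, r2)) / n
    # Expected agreement from each rater's marginal category frequencies.
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[c] * c2[c] for c in set(r1) | set(r2)) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical data: each staff pair rates one subject on two items.
pairs = [
    ([1, 0], [1, 0]),   # pair 1: rater a vs. rater b, subject 1
    ([1, 1], [0, 1]),   # pair 2: subject 2
    ([0, 0], [0, 1]),   # pair 3: subject 3
]
# Stack all pairs into two long columns and compute one pooled Kappa.
rater_a = [x for a, _ in pairs for x in a]
rater_b = [x for _, b in pairs for x in b]
print(round(cohen_kappa(rater_a, rater_b), 3))  # → 0.333
```

Note that pooling and averaging generally give different answers, because the pooled Kappa weights every rating equally while an average weights every pair equally.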

Thanks

Brian