
Re: Interrater reliability

Posted by Ryan on Aug 05, 2011; 2:01am
URL: http://spssx-discussion.165.s1.nabble.com/Interrater-reliability-tp4663573p4668398.html

What types of ratings were made?

Ryan

On Wed, Aug 3, 2011 at 2:41 PM, Lovins, Brian (lovinsbk) <[hidden email]> wrote:

Good afternoon,

I am looking to calculate Kappa as a measure of interrater reliability. I have 125 subjects, each rated by a different pair of staff; both members of a pair assess the same person, so there are 125 distinct rater pairs. I want to calculate an overall Kappa for the entire group. I can compute Kappa for each individual pair and average the results, but I was hoping there is a syntax/macro that would calculate the overall Kappa directly. The data are formatted as follows, but I can restructure them if needed:

 

Rater1a               Rater1b               Rater2a               Rater2b
Item 1 - Subject 1    Item 1 - Subject 1    Item 1 - Subject 2    Item 1 - Subject 2
Item 2 - Subject 1    Item 2 - Subject 1    Item 2 - Subject 2    Item 2 - Subject 2

Thanks

Brian
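
[Editor's note: for readers landing on this thread, here is a minimal sketch of one common approach to getting a single pooled Kappa in SPSS, rather than averaging per-pair Kappas. It assumes the wide layout shown above, and the variable names rating_a and rating_b are placeholders introduced here; the FROM lists would need to be extended to cover all 125 pairs. This is a sketch of one approach, not necessarily what Ryan would recommend once the rating types are known.]

* Stack the pairs so each row is one subject-item, with the pair's
  two ratings side by side in rating_a and rating_b (placeholder names;
  extend the FROM lists to all 125 pairs).
VARSTOCASES
  /MAKE rating_a FROM Rater1a Rater2a
  /MAKE rating_b FROM Rater1b Rater2b
  /INDEX=pair.
* One overall Kappa from the pooled crosstabulation of first vs. second ratings.
CROSSTABS
  /TABLES=rating_a BY rating_b
  /STATISTICS=KAPPA.

Note that a Kappa computed on the pooled table is not the same quantity as the mean of the 125 per-pair Kappas; which is more appropriate depends on whether category usage is comparable across pairs.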