Re: Interrater reliability
Posted by Bruce Weaver on Aug 03, 2011; 8:17pm
URL: http://spssx-discussion.165.s1.nabble.com/Interrater-reliability-tp4663573p4663914.html
One of my former bosses would no doubt urge you to consider using G-Theory. You can read about it in chapter 9 of Health Measurement Scales. Here's the Google Books link:
http://books.google.ca/books?id=UbKijeRqndwC&printsec=frontcover&dq=health+measurement+scales+streiner&hl=en&ei=bas5TuDDHorMsQKDvsg0&sa=X&oi=book_result&ct=result&resnum=1&ved=0CDAQ6AEwAA#v=onepage&q&f=false

As you'll see there, Norman & Streiner recommend a (free) program called G_String (http://fhsperd.mcmaster.ca/g_string/index.html).
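If you just want the single pooled Kappa you describe below, one simpler workaround is to stack the ratings from all 125 pairs into one file and run a single CROSSTABS with the KAPPA statistic. This is only a sketch, and it assumes the data have been restructured to one row per subject, with the two raters' scores in (hypothetical) variables named rater1 and rater2:

* Pooled Cohen's kappa across all 125 rater pairs (variable names are placeholders).
CROSSTABS
  /TABLES=rater1 BY rater2
  /STATISTICS=KAPPA.

Bear in mind that this treats "rater 1" versus "rater 2" as an arbitrary labelling within each pair and ignores pair-to-pair differences, which is one reason the G-Theory approach may be preferable.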
HTH.
Lovins, Brian (lovinsbk) wrote
Good afternoon,
I am looking to calculate Kappa as a measure of interrater reliability. I currently have 125 subjects, each rated by a different pair of staff. Each pair assesses the same person, so there are 125 different pairs of staff in all. I want to calculate an overall Kappa for the entire group. I can calculate it for the individual pairs and average the scores, but I was hoping there was a syntax/macro that would calculate the overall Kappa directly. The data are formatted as follows, but I can restructure the data if needed:
--- Table snipped ---
Thanks
Brian
--
Bruce Weaver
bweaver@lakeheadu.ca
http://sites.google.com/a/lakeheadu.ca/bweaver/
"When all else fails, RTFM."
PLEASE NOTE THE FOLLOWING:
1. My Hotmail account is not monitored regularly. To send me an e-mail, please use the address shown above.
2. The SPSSX Discussion forum on Nabble is no longer linked to the SPSSX-L listserv administered by UGA (https://listserv.uga.edu/).