Kappa in SPSS

Kappa in SPSS

KuharAna
Hello,

I performed a study with 4 raters, who rated 300 microscopic images. There
were 5 parameters to rate in each image:

1. percentage of reaction (values from 0 to 3)
2. intensity (values from 0 to 3)
3. background staining (values from 0 to 3)
4. morphology (0 = not OK, 1 = OK)
5. contrast (0 = not OK, 1 = OK)

(3 = high value, 0 = no reaction)

I wanted to compute kappa - meaning kappa between rater A and B, kappa between
rater A and C, kappa between rater A and D, etc.
But here I have a problem, because I have more than just one parameter, and
I am having difficulties figuring out how to do it.


Can I join the first three parameters (because they all have values from 0 to 3,
ordinal values), and the last two parameters (nominal values)?
The thing is, I would like to have only one kappa between two raters, not
two. But I probably can't join all five parameters?
Is it possible to sum the values of the five parameters and then
compare the summed values between two raters?

Any help will be appreciated.

Ana




--
Sent from: http://spssx-discussion.1045642.n5.nabble.com/

=====================
To manage your subscription to SPSSX-L, send a message to
[hidden email] (not to SPSSX-L), with no body text except the
command. To leave the list, send the command
SIGNOFF SPSSX-L
For a list of commands to manage subscriptions, send the command
INFO REFCARD

Re: Kappa in SPSS

Rich Ulrich

Please consider: (1) What is your purpose? and (2) Who is your audience for your information?

If these were my data, I imagine that I would /first/ be interested in the reliability of each aspect, and for each rater. Eventually, I might want a summary ... if everything does look fine enough. And, NO, it is just about never proper to combine sets of scores on different measures merely because their ranges are the same.

The paired t-test gives both a measure of correlation and a test of systematic difference. That is to say, you want to know that there is a good correlation between any two raters, and you want to know that they use the scale with the same "anchors". It is possible to be very consistent and yet have a slight bias. The weighted kappa (for continuous variables) is practically identical to the Pearson r, anyway; for a summary, report an average. There is such a thing as a multi-rater kappa for dichotomies, but I can't say that I've ever been pleased to see one.

Especially for dichotomies, you want to look at the actual 2x2 distributions. You want to consider (and report) "agreement" and "errors" when there is so much agreement that there is too little variability for useful tests and correlations or kappas. (Expect product-moment correlations to be smaller for the dichotomies, by statistical artifact.)

If you do know how to compute something that is considered a useful composite score from your five variables, you can perform the same testing on that composite.


Hope this helps.

--

Rich Ulrich
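Rich's two checks (correlation between raters, plus a paired test for systematic bias) can be sketched in plain Python. The function names and rating vectors below are invented for illustration:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two raters' scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    ssx = sum((a - mx) ** 2 for a in x)
    ssy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(ssx * ssy)

def paired_t(x, y):
    """Paired t statistic: tests whether one rater scores systematically
    higher than the other (a bias), even when the correlation is good."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    md = sum(d) / n
    sd = math.sqrt(sum((v - md) ** 2 for v in d) / (n - 1))
    return md / (sd / math.sqrt(n))

rater_a = [2, 3, 1, 0, 2]   # hypothetical 0-3 ratings
rater_b = [1, 2, 1, 0, 1]   # tracks rater A, but tends to score lower
print(pearson_r(rater_a, rater_b))  # high correlation
print(paired_t(rater_a, rater_b))   # yet a clear positive bias
```

Here rater B tracks rater A closely (high r) while tending to score a point lower: the consistent-but-biased case described above, which a correlation alone would miss.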


From: SPSSX(r) Discussion <[hidden email]> on behalf of KuharAna <[hidden email]>
Sent: Thursday, January 18, 2018 4:41:39 AM
To: [hidden email]
Subject: Kappa in SPSS
 

Re: Kappa in SPSS

KuharAna
Thanks for the quick reply.
The main purpose of my study is not really the evaluation of agreement
between raters; however, it is one of the first things I want to do (I am
working on my master's degree).

So what would you recommend? Kappa statistics for each individual
parameter?
What about the intraclass correlation coefficient?





Re: Kappa in SPSS

Art Kendall
Use CROSSTABS to visualize the pairs of coders for each DV.
Specimens are the cases in this application.


Search the archives of this discussion list for "Krippendorff".
Those macros deal with inter-rater/coder/judge reliability.

For the various types of intraclass correlation, search these archives for
"reliability" and "variance".





-----
Art Kendall
Social Research Consultants

Re: Kappa in SPSS

Mike
I agree with the pairwise crosstabs, but I think it might be worthwhile
to also get the percent correct or "perfect agreement" for the five measures.
If we use a strict traditional definition of agreement (i.e., exact matches
of responses), then perfect agreement would mean that the pattern
of 5 values for one coder exactly matches the pattern for another
coder.  It is unlikely that *all* coders would show such a pattern, but
some likely will, and it might be useful to know how many do so.  Similarly,
there may be some coders who do NOT match the pattern of any other coder on
any case, but this too should be a small number (ideally no one
completely disagrees with another coder, though if a coder is using a coding
schema that systematically disagrees with the schema used by others, then
the agreement rate might be well below chance levels).  Again, seeing the
number of exact matches for each pair could provide some useful information.
Just a suggestion.

-Mike Palij
New York University
[hidden email]
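Mike's strict exact-match criterion is straightforward to compute. A sketch with hypothetical codings, where each case is the 5-tuple of parameter values one coder gave an image (names and data are mine, for illustration):

```python
def perfect_agreement(coder_a, coder_b):
    """Proportion of cases on which two coders match on every parameter.
    Each case is a tuple of the five parameter values for one image."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

# Hypothetical codings of three images (5 values per image):
a = [(2, 3, 1, 1, 0), (0, 0, 0, 1, 1), (3, 2, 1, 0, 1)]
b = [(2, 3, 1, 1, 0), (0, 1, 0, 1, 1), (3, 2, 1, 0, 1)]
print(perfect_agreement(a, b))  # 2 of 3 cases agree on all five values
```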


On Fri, Jan 19, 2018 at 7:57 AM, Art Kendall <[hidden email]> wrote:

Re: Kappa in SPSS

Bruce Weaver
Administrator
In reply to this post by Rich Ulrich
According to Norman & Streiner (in their book, /Biostatistics - The Bare
Essentials/), weighted kappa with quadratic weights is exactly equal to the
most common form of intraclass correlation (ICC).  The following rather long
hyperlink should take you to the page where they say this.

https://books.google.ca/books?id=y4tWQl_8Ni8C&pg=PA258&lpg=PA258&dq=weighted+kappa+icc+equivalent+Norman+Streiner&source=bl&ots=oNeOiPDlFe&sig=xvaY1HW3FSKzpJtY--J-MzD9JwA&hl=en&sa=X&ved=0ahUKEwi5tYrql-TYAhUIHqwKHZuPDowQ6AEILzAB#v=onepage&q=weighted%20kappa%20icc%20equivalent%20Norman%20Streiner&f=false

HTH.
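The equivalence Bruce cites can be checked numerically. The sketch below is pure Python with invented ratings; when the two raters happen to have identical marginal distributions, as here, quadratic-weighted kappa coincides with the Pearson r, and with unequal marginals (different means or spreads) the two drift apart.

```python
def quadratic_weighted_kappa(r1, r2, kmax):
    """Cohen's weighted kappa with quadratic weights on a 0..kmax scale."""
    k, n = kmax + 1, len(r1)
    w = lambda i, j: 1 - (i - j) ** 2 / (k - 1) ** 2   # agreement weight
    m1 = [r1.count(c) / n for c in range(k)]           # marginals, rater 1
    m2 = [r2.count(c) / n for c in range(k)]           # marginals, rater 2
    po = sum(w(a, b) for a, b in zip(r1, r2)) / n      # observed (weighted)
    pe = sum(w(i, j) * m1[i] * m2[j]                   # chance-expected
             for i in range(k) for j in range(k))
    return (po - pe) / (1 - pe)

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

r1 = [0, 1, 2, 3, 0, 1, 2, 3]
r2 = [0, 1, 2, 3, 1, 0, 3, 2]   # same marginal distribution as r1
print(quadratic_weighted_kappa(r1, r2, 3), pearson_r(r1, r2))  # both 0.8
```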


Rich Ulrich wrote
> --- snip ---
> The weighted kappa (for continuous variables) is practically identical to
> the pearson r, anyway; for summary, report an average.
>
> --- snip ---





--
Bruce Weaver
bweaver@lakeheadu.ca
http://sites.google.com/a/lakeheadu.ca/bweaver/

"When all else fails, RTFM."

PLEASE NOTE THE FOLLOWING: 
1. My Hotmail account is not monitored regularly. To send me an e-mail, please use the address shown above.
2. The SPSSX Discussion forum on Nabble is no longer linked to the SPSSX-L listserv administered by UGA (https://listserv.uga.edu/).

Re: Kappa in SPSS

bdates
I'm piggy-backing on Bruce's post. First, in 1973, Fleiss and Cohen published an article which established the equivalence of the intraclass correlation coefficient and weighted kappa. Second, in 1977, Hubert published a paper in which he discusses three definitions of agreement. Agreement statistics are based on the definition that agreement between any two raters counts as an agreement; so if there are more than two raters, agreement between any two constitutes an instance of agreement. All agreement statistics use this approach. Third, if you are using multiple scales, you must establish interrater reliability/agreement for each scale. For the three of your scales that are ordinal, you should use the intraclass correlation coefficient. You will need to decide which model you have used.

Model 1: Raters are a random sample from a specified population of raters, and each rater does not rate all subjects/objects. Therefore, each subject/object is rated by a potentially different set of raters.

Model 2: Raters are a random sample from a specified population of raters, and each rater rates each subject/object.

Model 3: Raters constitute the entire population of raters, and each rates each subject/object.

Most projects fit Model 3, but you'll have to decide if this is true for you.

You should choose the average measure coefficient, which is equivalent to Fleiss' kappa.

For the nominal scale use Fleiss' kappa.

Do not divide your raters into pairs unless you get non-significant results, in which case you may want to know which rater may have had difficulty. However, research does not typically report such investigations. They are for you to understand which of your raters, if any, may need additional training. I suspect since this is your thesis that your data are final, but if you're still in a training stage, it may help.

Brian
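Fleiss' kappa, which Brian recommends for the nominal scales, has a simple closed form. A minimal pure-Python sketch, assuming the data are arranged as a subjects-by-categories table of rater counts (the function name and example data are mine, for illustration):

```python
def fleiss_kappa(counts):
    """Fleiss' kappa from an N x k table where counts[i][j] is the number
    of raters assigning subject i to category j (equal raters per subject)."""
    N = len(counts)
    n = sum(counts[0])                  # raters per subject
    k = len(counts[0])                  # number of categories
    # Overall proportion of all assignments falling in each category.
    p = [sum(row[j] for row in counts) / (N * n) for j in range(k)]
    # Agreement within each subject, averaged over subjects.
    P_bar = sum((sum(c * c for c in row) - n) / (n * (n - 1))
                for row in counts) / N
    P_e = sum(pj * pj for pj in p)      # chance-expected agreement
    return (P_bar - P_e) / (1 - P_e)

# Four raters sorting three images into two categories, perfect agreement:
print(fleiss_kappa([[4, 0], [0, 4], [4, 0]]))  # 1.0
```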
________________________________________
From: SPSSX(r) Discussion [[hidden email]] on behalf of Bruce Weaver [[hidden email]]
Sent: Friday, January 19, 2018 9:05 AM
To: [hidden email]
Subject: Re: Kappa in SPSS


Re: Kappa in SPSS

Rich Ulrich

Brian,

Fine post, except for

"Do not divide your raters into pairs unless you get non-significant results, in which case you may want to know which rater may have had difficulty. "


We are properly concerned with "too many tests" when we are testing our main hypotheses. Detailed tests
after a non-significant overall test are wholly exploratory. 

We are properly careful when we look very closely at whatever-it-is that may have upset our
data collection, etc.; "too many tests" does not exist when looking for hazards -- test everything, and
be concerned by "trends" where they seem meaningful, even when "not significant".

--
Rich Ulrich

Re: Kappa in SPSS

bdates
Rich,

Agreed. I should have been more inclusive. I've always thought it interesting that Fleiss set the direction for more elucidation of results by producing category-based kappas, but didn't do the same for rater-based kappas. I wouldn't be as interested in individual pairs as in what each rater looked like in terms of agreement with the other rater(s). Of course, in the two-rater circumstance, it's not clear which rater may have had the difficulty, if not both. In my syntax, I produce a table which presents just that: the proportion of agreement each rater had with the other rater(s) on each category. So the result is both rater- and category-based.

Brian
________________________________________
From: Rich Ulrich [[hidden email]]
Sent: Friday, January 19, 2018 7:03 PM
To: [hidden email]; Dates, Brian
Subject: Re: Kappa in SPSS


Re: Kappa in SPSS

bdates
In reply to this post by KuharAna
Ana,

I don't know if you've been following the current thread on the SPSS listserv which involves Fleiss' kappa. The person who initiated the thread wants to compute all pairwise kappas among 30 raters. David Marso posted a solution using a macro. I've adapted the macro to also compute the overall kappa, and in the macro call line included only rater1 to rater4. Here it is below, in case you're still interested. You will still need to do each scale separately, since you cannot do separate scales at the same time. You also still have the problem of deciding how to analyze the three variables that you scaled 0 to 3. Since they're ordinal, you could also use the ICC. Doing an ICC for each pair would require copying this syntax and replacing the "STATISTICS KAPPA" part of the CROSSTABS command with the syntax for an ICC. That would give you two syntaxes, one for the ICC and one for kappa. Then again, you could just do all five scales as kappas. Good luck.

/* Compute a kappa for every rater pair. */

DEFINE !AllPairsKappa (raters = !CMDEND)
!LET !CPY = !raters
!DO !j1 !IN (!raters)
!LET !CPY = !TAIL(!CPY)
!DO !j2 !IN (!CPY)
CROSSTABS TABLES= !j1 BY !j2 /STATISTICS= KAPPA.
!DOEND
!DOEND
!ENDDEFINE.
!AllPairsKappa raters = rater1 rater2 rater3 rater4 .

/* Compute the overall kappa (requires the STATS FLEISS KAPPA extension command). */

DATASET ACTIVATE DataSet1.
STATS FLEISS KAPPA VARIABLES=rater1 rater2 rater3 rater4
 /OPTIONS CILEVEL=95.
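The macro's nested loops can be cross-checked outside SPSS. A rough pure-Python equivalent of the pairwise CROSSTABS kappas, with invented rating vectors:

```python
from itertools import combinations
from collections import Counter

def cohen_kappa(r1, r2):
    """Unweighted Cohen's kappa between two raters' rating vectors."""
    n = len(r1)
    # Observed agreement: proportion of exact matches.
    po = sum(a == b for a, b in zip(r1, r2)) / n
    # Chance-expected agreement from the two raters' marginal proportions.
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum((c1[c] / n) * (c2[c] / n) for c in set(r1) | set(r2))
    return (po - pe) / (1 - pe)

# One kappa per rater pair, as the macro's nested !DO loops produce:
ratings = {                      # hypothetical 0-3 ratings of 8 images
    "rater1": [0, 1, 2, 3, 1, 0, 2, 3],
    "rater2": [0, 1, 2, 3, 1, 0, 2, 2],
    "rater3": [1, 1, 2, 3, 0, 0, 2, 3],
    "rater4": [0, 1, 2, 2, 1, 0, 3, 3],
}
for a, b in combinations(ratings, 2):
    print(a, b, round(cohen_kappa(ratings[a], ratings[b]), 3))
```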





Brian
________________________________________
From: SPSSX(r) Discussion [[hidden email]] on behalf of KuharAna [[hidden email]]
Sent: Thursday, January 18, 2018 4:41 AM
To: [hidden email]
Subject: Kappa in SPSS


Re: Kappa in SPSS

KuharAna
In reply to this post by Rich Ulrich
Thank you all for the quick replies.
I computed kappa for each parameter between two raters, and also Fleiss'
kappa for each parameter across all raters.
A few more interesting questions... Why is Fleiss' kappa usually lower than
the ICC (intraclass correlation)? Does anyone know? At least in my experience it is.

I know that the kappa method is used for categorical and nominal variables. Is
the ICC supposed to be used only for quantitative (continuous) variables?

Again, I really appreciate the answers.





Re: Kappa in SPSS

bdates

Ana,


Fleiss' kappa was developed for nominal data. It is a measure of agreement, not of reliability in the sense used for ordinal or interval variables. It is therefore generally lower than the ICC, which was developed for interval data and has been found applicable to ordinal data as well; the ICC is a form of correlation, not agreement. It is weighted kappa, not ordinary Fleiss' kappa for nominal data, that is comparable to the ICC.


Brian Dates

From: SPSSX(r) Discussion <[hidden email]> on behalf of KuharAna <[hidden email]>
Sent: Thursday, February 1, 2018 7:09:03 PM
To: [hidden email]
Subject: Re: Kappa in SPSS
 

Re: Kappa in SPSS

Rich Ulrich
In reply to this post by KuharAna

ICC is for quantitative scores; kappa is not. When the quantity is more than a dichotomy,
there is a difference: ICC will be higher if the misses are near-misses.

For kappa, every miss is a miss; you get zero credit for being close, whereas ICC does give
credit for being close (or discredit for being far off).


--

Rich Ulrich
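Rich's point is easy to see numerically. A sketch with invented ratings that differ only by near-misses (one scale point apart): unweighted kappa charges full price for each miss, while quadratic weights give partial credit, mimicking the ICC's behavior.

```python
def kappa(r1, r2, kmax, weights="unweighted"):
    """Cohen's kappa on a 0..kmax scale; 'quadratic' weights give partial
    credit for near-misses, while unweighted kappa gives none."""
    k, n = kmax + 1, len(r1)
    if weights == "quadratic":
        w = lambda i, j: 1 - (i - j) ** 2 / (k - 1) ** 2
    else:
        w = lambda i, j: 1.0 if i == j else 0.0
    m1 = [r1.count(c) / n for c in range(k)]   # marginal proportions
    m2 = [r2.count(c) / n for c in range(k)]
    po = sum(w(a, b) for a, b in zip(r1, r2)) / n          # observed
    pe = sum(w(i, j) * m1[i] * m2[j]                       # chance-expected
             for i in range(k) for j in range(k))
    return (po - pe) / (1 - pe)

# Raters who disagree by exactly one scale point on three of eight cases:
r1 = [0, 1, 2, 3, 0, 1, 2, 3]
r2 = [0, 1, 2, 3, 1, 2, 3, 3]
print(kappa(r1, r2, 3))               # every miss counts fully
print(kappa(r1, r2, 3, "quadratic"))  # near-misses earn partial credit
```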


From: SPSSX(r) Discussion <[hidden email]> on behalf of KuharAna <[hidden email]>
Sent: Thursday, February 1, 2018 7:09:03 PM
To: [hidden email]
Subject: Re: Kappa in SPSS
 

Re: Kappa in SPSS

nina
In reply to this post by KuharAna
Hi Brian,

Can you suggest any references that discuss the difference(s) between "similarity" and "correlation"? Of course, it makes intuitive sense that a strong covariation/correlation between the ratings of two raters might be based on data that are far from similar in terms of their absolute levels. But my impression (I might err here...) is that this point is rarely discussed in the literature?

Best,
Nina


Re: Kappa in SPSS

David Marso
Administrator
Just out of curiosity, I entered 'similarity vs correlation' into a search
engine and found probably three days of reading just from:
http://ramet.elte.hu/~podani/3-Distance,%20similarity.pdf
I have only browsed through it, but it looks interesting.

Correlation is but one type of similarity measure.  There are many, many
similarity and distance measures, and also many types of correlation measures.
Hopefully this link will be helpful.







-----
Please reply to the list and not to my personal email.
Those desiring my consulting or training services please feel free to email me.
---
"Nolite dare sanctum canibus neque mittatis margaritas vestras ante porcos ne forte conculcent eas pedibus suis."
Cum es damnatorum possederunt porcos iens ut salire off sanguinum cliff in abyssum?"

Re: Kappa in SPSS

Bruce Weaver
Administrator
For some other chapters by the same author, go to:

   http://ramet.elte.hu/~podani/subindex.html

Then click on the Books link on the left, and scroll down to "Introduction
to the Exploration of Multivariate Biological Data".



--
Bruce Weaver
bweaver@lakeheadu.ca
http://sites.google.com/a/lakeheadu.ca/bweaver/

"When all else fails, RTFM."

PLEASE NOTE THE FOLLOWING: 
1. My Hotmail account is not monitored regularly. To send me an e-mail, please use the address shown above.
2. The SPSSX Discussion forum on Nabble is no longer linked to the SPSSX-L listserv administered by UGA (https://listserv.uga.edu/).

Re: Kappa in SPSS

bdates

Nina,


Podani's Chapter 3, which David sent a link to, is an excellent discussion on the difference between similarity and correlation, as well as a discussion of dissimilarity. He presents a very large number of statistics for the measurement of similarity and some of its correlational counterparts. He also describes software for the purpose of calculating similarity coefficients. There is some discussion of the notion of agreement as it relates to similarity.


Brian Dates

From: SPSSX(r) Discussion <[hidden email]> on behalf of Bruce Weaver <[hidden email]>
Sent: Monday, February 5, 2018 9:29:25 AM
To: [hidden email]
Subject: Re: Kappa in SPSS
 
For some other chapters by the same author, go to:

   http://ramet.elte.hu/~podani/subindex.html

Then click on the Books link on the left, and scroll down to "Introduction
to the Exploration of Multivariate Biological Data".



David Marso wrote
> I just out of curiosity entered 'similarity vs correlation' into a search
> engine and found probably three days of reading just from:
> http://ramet.elte.hu/~podani/3-Distance,%20similarity.pdf
> I have only browsed through it, but it looks interesting.
>
> Correlation is but one type of similarity measure.  There are many, many
> similarity and distance measures, and also many types of correlation measures.
> Hopefully this link will be helpful.
>
>
> nina wrote
>> Hi Brian,
>>
>> can you suggest any references that discuss the difference(s) between
>> "similarity" and "correlation"? Of course, it makes intuitive sense that
>> a
>> strong covariation/correlation between the ratings of two raters might be
>> based on data that are far from being similar in terms of their absolute
>> levels. But my impression (I might err here...) is that this point is
>> rarely discussed in the literature?
>>
>> Best,
>> Nina
>>
> -----
> Please reply to the list and not to my personal email.
> Those desiring my consulting or training services please feel free to
> email me.
> ---
> "Nolite dare sanctum canibus neque mittatis margaritas vestras ante porcos
> ne forte conculcent eas pedibus suis."
> Cum es damnatorum possederunt porcos iens ut salire off sanguinum cliff in
> abyssum?"
> --
> Sent from: http://spssx-discussion.1045642.n5.nabble.com/
>





-----
--
Bruce Weaver
[hidden email]
http://sites.google.com/a/lakeheadu.ca/bweaver/

"When all else fails, RTFM."

NOTE: My Hotmail account is not monitored regularly.
To send me an e-mail, please use the address shown above.

--
Sent from: http://spssx-discussion.1045642.n5.nabble.com/

Reply | Threaded
Open this post in threaded view
|

Re: Kappa in SPSS

bdates

An addendum to my last post. If anyone is interested, Marta Garcia-Granero developed macros for some of the similarity measures mentioned in Podani's chapter: Lin's concordance, Bland-Altman, Passing-Bablok, and Deming. Her website is https://gjyp.nl/marta/


Brian Dates
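For readers without SPSS open, Lin's concordance correlation coefficient is simple enough to sketch in plain Python. This is only a minimal illustration of the formula on made-up 0-3 ratings, not a substitute for Marta's macros:

```python
from statistics import fmean

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient for two paired series.

    Equals Pearson's r attenuated by any shift in location or scale
    between the series, so it rewards agreement, not just association.
    """
    n = len(x)
    mx, my = fmean(x), fmean(y)
    sx = sum((a - mx) ** 2 for a in x) / n                     # variance of x
    sy = sum((b - my) ** 2 for b in y) / n                     # variance of y
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n   # covariance
    return 2 * sxy / (sx + sy + (mx - my) ** 2)

rater_a = [0, 1, 2, 3, 2, 1, 0, 3]   # hypothetical 0-3 ratings
rater_b = [0, 1, 2, 3, 2, 1, 1, 3]   # near-identical ratings: CCC close to 1
print(lins_ccc(rater_a, rater_b))
print(lins_ccc(rater_a, [v + 1 for v in rater_a]))  # r is 1.0, but the shift drags CCC down
```

The denominator term (mx - my)**2 is what separates this from ordinary correlation: a constant offset between raters lowers the CCC even when r stays perfect.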

Reply | Threaded
Open this post in threaded view
|

Re: Kappa in SPSS

Art Kendall
In reply to this post by David Marso
see the documentation for PROXIMITIES.

IIRC there are 30-some measures of distance/similarity etc. among
entities/cases.
This does not include difference/distance/similarity measures for strings.

Many of the concepts behind these can also be used to look at
responses/values in variables.
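As a plain-Python illustration of how measures of this kind can disagree (a sketch with invented profiles, not PROXIMITIES output), the same pair of cases can be "close" by one measure and "far" by another:

```python
from math import sqrt
from statistics import fmean

def euclidean(x, y):
    """Straight-line distance: sensitive to absolute level."""
    return sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def cosine(x, y):
    """Cosine similarity: ignores magnitude, keeps direction."""
    dot = sum(a * b for a, b in zip(x, y))
    return dot / (sqrt(sum(a * a for a in x)) * sqrt(sum(b * b for b in y)))

def pearson(x, y):
    """Correlation: ignores both level and scale."""
    mx, my = fmean(x), fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return num / sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))

p = [1, 2, 3, 4]
q = [2, 4, 6, 8]        # same profile shape, double the level
print(euclidean(p, q))  # about 5.48: far apart in absolute terms
print(cosine(p, q))     # 1.0 (up to rounding): identical direction
print(pearson(p, q))    # 1.0: perfect linear association
```

Which measure is "right" depends entirely on whether absolute level matters for the question at hand, which is the psychometric point being made in this thread.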

Bdates has given some great info from a psychometric perspective.

If you search the archive of this list for Krippendorff, you can see how he
deals with measures when the cases are pieces of text judged/rated/coded by
several people.







-----
Art Kendall
Social Research Consultants
--
Sent from: http://spssx-discussion.1045642.n5.nabble.com/

Reply | Threaded
Open this post in threaded view
|

Re: Kappa in SPSS

Rich Ulrich
In reply to this post by nina

For two raters, I've long preferred looking at (not the ICC but) the ordinary r, because people are used to the size of it, along with a paired t-test to check for any difference. [For 2x2 data, that could be kappa and Kendall's test for changes.]

For the data I usually dealt with, with 3 or more raters, I looked at them in pairs. My emphasis is right, I think, whenever you are developing your own ad-hoc scales.

However, for the summaries that get published, or for people using scales developed by others, what is required (publication) or sufficient (cross-check) is an overall number like the ICC.

I think that you are right, if you are suggesting that much of the literature on reliability makes it easy for people to overlook or forget the possible complications of differences in level.

--

Rich Ulrich


From: SPSSX(r) Discussion <[hidden email]> on behalf of Nina Lasek <[hidden email]>
Sent: Monday, February 5, 2018 4:57:46 AM
To: [hidden email]
Subject: Re: Kappa in SPSS
 
Hi Brian,

can you suggest any references that discuss the difference(s) between "similarity" and "correlation"? Of course, it makes intuitive sense that a strong covariation/correlation between the ratings of two raters might be based on data that are far from being similar in terms of their absolute levels. But my impression (I might err here...) is that this point is rarely discussed in the literature?

Best,
Nina
