Re: Measures of agreement for individual categories when the categories are ordinal


bdates
Margaret,

Gwet's AC1 (Agreement Coefficient 1) takes the degree of disagreement into
consideration when calculating Pe, the probability of chance agreement.  The
formula is essentially the sum across all categories of (marginal proportion
of category use x (1 - marginal proportion of category use)); with a single
category there is obviously no need to sum.  The result of that calculation
is then multiplied by 1/(q - 1), where q is the number of categories.
Krippendorff's alpha is another accepted statistic built on degree of
disagreement, but I'm not aware of its use with single categories.
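
In case seeing the arithmetic end to end helps, below is a minimal Python
sketch of that calculation for the two-rater case.  The function name and
the example ratings are hypothetical, made up purely for illustration, so
treat this as a sketch of the formula above rather than a reference
implementation (check it against Gwet's own writing before relying on it).

def gwet_ac1(ratings_a, ratings_b, categories):
    # Hypothetical illustration of Gwet's AC1 for two raters.
    n = len(ratings_a)
    q = len(categories)
    # Pa: observed agreement, the proportion of subjects rated identically.
    pa = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Pe: sum over categories of pi_k * (1 - pi_k), where pi_k is the
    # marginal proportion of category use averaged over the two raters,
    # then multiplied by 1 / (q - 1).
    pe = 0.0
    for k in categories:
        pi_k = (ratings_a.count(k) + ratings_b.count(k)) / (2 * n)
        pe += pi_k * (1 - pi_k)
    pe /= (q - 1)
    return (pa - pe) / (1 - pe)

# Made-up example: a 7-point ordinal scale, ten subjects, two raters.
rater1 = [1, 2, 2, 3, 5, 5, 6, 7, 7, 4]
rater2 = [1, 2, 3, 3, 5, 4, 6, 7, 6, 4]
print(gwet_ac1(rater1, rater2, categories=range(1, 8)))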

Brian


> Dear all
>
>   Apologies for cross-posting
>
>   I have been using an intra-class correlation coefficient to analyse my
> data which is on an ordinal scale from 1 to 7. The analysis involves a
> two-way mixed effects model in which overall absolute agreement is being
> measured. I would like to complement the results to date with further
> results relating to the level of agreement for each category individually
> (under the assumption that there are two raters).  As I understand from my
> reading, there are a number of definitions of Kappa statistics which allow
> for the assessment of chance-corrected inter-rater agreement over grade A
> only, say.  However, it appears that the related calculations involve the
> assumption that there are only two categories (in the above example:
> 'grade A' or 'other grade').  The generalization 'other grade' removes the
> capacity to assess the extent to which individual examiners disagree on an
> ordinal scale when one examiner assigns grade A but the other does
> not.  I therefore wonder if anyone is aware of alternative
> chance-corrected approaches to assessing agreement
> between two raters for a single category whereby whenever the raters
> disagree, the extent of disagreement is taken into consideration.
>
>   I look forward to being educated!
>
>   Best wishes
>
>   Margaret
>
>
>
>