Cohen's kappa

  1. M

    Kappa-value comparison

    Can I compare a kappa value from a binary data set with one from an ordinal data set? And does it change anything?
  2. C

    Cohen's kappa inter-reader agreement formula

    Hello, is the formula for calculating Cohen's kappa the same for multiple categories as for a 2x2 table? Thank you
  3. M

    What happens after Content Analysis and Cohen's Kappa?

    Hi, I have 2 raters for content analysis and a satisfactory Cohen's kappa (0.805). Can I use either rater's values for the analysis, or should I use the average of the 2 raters? Thanks in advance for your help! Regards, Marion
  5. H

    Test-Retest Reliability Coefficient

    Hi Folks, Hopefully this is a relatively easy question. I collected data on a measure at two time points, and I would like to run a test-retest reliability check. Is a Pearson correlation coefficient the one to use, or are there other thoughts on this? Many thanks for your time!
  6. R

    Weighted Kappas...please help!

    Hello! I've looked all over for how to do this but am still coming up short. Please help! I have 16 data points, and I have provided the weights. Data (Data Point / Weight / Component): 1 / .33 / A; 2 / .66...
  7. M

    How to calculate inter-observer/ inter-rater reliability of ordinal data in R?

    Hello, I have a question about which measure to use to calculate inter-observer reliability in R, once for an ordinal and once for a nominal data set. Part 1) Ordinal data set: 2 observers gave scores on a scale from 1.0 to 5.0 in steps of 0.25. Example of data: observ1...
  8. U

    Interrater reliability help?

    Dear all, I'm hoping for some direction in choosing between Cohen's kappa and ICC for my interrater reliability. I have 2 coders rating the presence of expressions as 1 or 0 (present or not). I'm leaning towards Cohen's kappa, but I don't totally understand the relevant difference between the...
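
A recurring theme in questions 2 and 8 above is whether Cohen's kappa changes when there are more than two categories. It doesn't: kappa = (p_o - p_e) / (1 - p_e) uses the observed agreement p_o and the chance agreement p_e computed from each rater's marginals, regardless of how many categories there are. A minimal sketch in Python (the ratings below are invented for illustration):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters; the same formula works for any
    number of categories, not just a 2x2 table."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    categories = set(rater1) | set(rater2)
    # Observed agreement: fraction of items both raters labelled identically.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement under independence, from each rater's marginals.
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum(c1[c] * c2[c] for c in categories) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Works identically for binary (2x2) and multi-category ratings:
binary_kappa = cohens_kappa([1, 0, 1, 1, 0, 0], [1, 0, 1, 0, 0, 1])
multi_kappa = cohens_kappa(["a", "b", "c", "a"], ["a", "b", "b", "a"])
```

Note that unweighted kappa treats all disagreements as equally serious, so for ordinal scales (question 1's comparison, question 7's 1.0-5.0 scores) a weighted kappa is usually the better fit.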
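For the ordinal data in questions 6 and 7, a weighted kappa penalizes near-misses less than distant disagreements. Question 7 asks about R, where weighted kappa is available in standard packages; as a language-neutral sketch of the underlying formula, here is a Python version with quadratic weights by default (the function name and example ratings are my own, for illustration only):

```python
def weighted_kappa(rater1, rater2, categories, weight="quadratic"):
    """Weighted Cohen's kappa for ordinal ratings.

    `categories` is the ordered list of possible scores. Disagreement
    weights on the category indices are (i - j)**2 (quadratic) or
    |i - j| (linear). With 0/1 weights this reduces to unweighted kappa.
    """
    idx = {c: i for i, c in enumerate(categories)}
    k, n = len(categories), len(rater1)
    # Observed contingency table of (rater1 category, rater2 category).
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater1, rater2):
        obs[idx[a]][idx[b]] += 1
    m1 = [sum(row) for row in obs]                              # rater 1 marginals
    m2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]   # rater 2 marginals
    num = den = 0.0
    for i in range(k):
        for j in range(k):
            w = (i - j) ** 2 if weight == "quadratic" else abs(i - j)
            num += w * obs[i][j]            # observed weighted disagreement
            den += w * m1[i] * m2[j] / n    # expected weighted disagreement
    # den is 0 only in the degenerate case where both raters used one category.
    return 1 - num / den
```

For example, `weighted_kappa([1, 2, 3], [1, 2, 2], [1, 2, 3])` gives 2/3: the single disagreement is an adjacent-category miss, so it is penalized lightly, whereas unweighted kappa would treat it the same as a 1-vs-3 disagreement.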
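On question 5 (test-retest reliability): Pearson's r measures linear association, not agreement, so it cannot detect a systematic shift between the two sessions; an intraclass correlation computed for absolute agreement is often preferred for exactly that reason. A quick sketch showing the limitation (the scores are invented):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation between scores at two time points."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Pearson is insensitive to a constant shift between sessions:
time1 = [10, 12, 14, 16, 18]
time2 = [12, 14, 16, 18, 20]   # every participant scored 2 points higher
shifted_r = pearson_r(time1, time2)
```

Here `shifted_r` is exactly 1.0 even though no participant reproduced their original score, which is why test-retest checks often report an absolute-agreement ICC alongside (or instead of) Pearson's r.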