interrater agreement

  1.

    Agreement between raters / inter-rater reliability: Dilemma

    Hi, I have done a test where 2 different raters had to rate 1000 texts independently according to the following options: Happy, Sad, Angry, Confused, Could not tell. Now I am trying to determine their agreement (i.e., reliability). However, I am facing a dilemma, especially when the do...

    (A minimal Cohen's kappa sketch for a two-rater, nominal-category setup like this follows after this list.)
  2.

    I need help with Inter-Rater agreement

    I'm developing a scale and my items were already checked by a group of 4 judges. They gave a dichotomous response (yes, leave the item, or no, remove it). Now what should I do? I know I have to calculate something named "inter-rater agreement", but I don't know how and I don't know which...

    (See the Fleiss' kappa sketch after this list for agreement among more than two raters.)
  3.

    Intraclass Correlation Coefficient producing unexpected results... can you help me understand

    Hi, so I am trying to measure the reliability of measurements taken with callipers by 4 different users on the same 10 samples. This is my data in millimeters:

    Sample   Barry   Sarah   Aoife   Jen
    1        2.18    2.15    2.27    1.62
    2        1.695   1.82    2.07    1.33
    3        1.76    1.46    2.20    1.18
    4        1.83    1.94    3.00    1.51
    5        ...

    (See the ICC sketch after this list.)
  4.

    inter-rater agreement: uses of kappa, Pearson's, and intraclass coefficients

    Hi everyone: I really appreciate all of the great posts on here! I'm trying to wrap my head around a couple of things. First of all, a bit of background: I am trying to "validate" a questionnaire that was developed against a "gold standard": medical records. The variables are all...

    (See the kappa / Pearson / ICC comparison sketch after this list.)
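For the first thread (two raters, five nominal categories), the usual chance-corrected agreement statistic is Cohen's kappa. A minimal sketch in Python with scikit-learn; the rating lists below are made-up placeholders, not data from the post:

```python
from sklearn.metrics import cohen_kappa_score

categories = ["Happy", "Sad", "Angry", "Confused", "Could not tell"]

# Hypothetical ratings; the real study would have 1000 entries per rater.
rater_1 = ["Happy", "Sad", "Angry", "Happy", "Could not tell", "Confused"]
rater_2 = ["Happy", "Sad", "Confused", "Happy", "Could not tell", "Confused"]

kappa = cohen_kappa_score(rater_1, rater_2, labels=categories)
print(f"Cohen's kappa: {kappa:.3f}")  # 1 = perfect agreement, 0 = chance level
```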
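For the second thread (4 judges giving yes/no verdicts on each item), Cohen's kappa only covers pairs of raters; a common choice for several raters on nominal categories is Fleiss' kappa. A minimal sketch with statsmodels, using made-up decisions (1 = keep the item, 0 = remove it):

```python
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Rows = scale items, columns = judges; hypothetical 0/1 decisions.
decisions = np.array([
    [1, 1, 1, 1],
    [1, 0, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 0, 1],
])

# aggregate_raters turns the raw ratings into an items-by-categories count table.
counts, _categories = aggregate_raters(decisions)
print(f"Fleiss' kappa: {fleiss_kappa(counts):.3f}")
```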
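For the third thread (4 users measuring the same samples with callipers), the measurements are continuous, so an intraclass correlation coefficient is the standard reliability statistic rather than kappa. A minimal sketch with pingouin, using the four complete samples quoted above; pingouin expects long-format data, one row per sample-rater pair:

```python
import pandas as pd
import pingouin as pg

raters = ["Barry", "Sarah", "Aoife", "Jen"]
values = [
    [2.18, 2.15, 2.27, 1.62],   # Sample 1
    [1.695, 1.82, 2.07, 1.33],  # Sample 2
    [1.76, 1.46, 2.20, 1.18],   # Sample 3
    [1.83, 1.94, 3.00, 1.51],   # Sample 4
]

# Reshape to long format: one row per (sample, rater) measurement.
long = pd.DataFrame(
    [(i + 1, r, v) for i, row in enumerate(values) for r, v in zip(raters, row)],
    columns=["sample", "rater", "mm"],
)

icc = pg.intraclass_corr(data=long, targets="sample", raters="rater", ratings="mm")
print(icc[["Type", "Description", "ICC", "CI95%"]])
```

The output table reports several ICC forms (single vs. average raters, consistency vs. absolute agreement); which one to quote depends on whether the users count as a random sample of raters and whether systematic offsets between them should lower the score.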
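For the fourth thread, the choice of statistic mostly follows the variable type: kappa for categorical questionnaire items checked against the medical record, and for continuous items Pearson's r (which only captures linear association) versus an absolute-agreement ICC (which also penalises systematic differences). A minimal sketch with hypothetical data, none of it from the post:

```python
import numpy as np
import pandas as pd
import pingouin as pg
from scipy.stats import pearsonr
from sklearn.metrics import cohen_kappa_score

# Categorical item (e.g. "ever diagnosed with X": yes/no), hypothetical values.
questionnaire = ["yes", "no", "yes", "yes", "no", "no", "yes", "no"]
records = ["yes", "no", "yes", "no", "no", "no", "yes", "yes"]
print(f"kappa: {cohen_kappa_score(questionnaire, records):.3f}")

# Continuous item (e.g. weight in kg), hypothetical values.
q_kg = np.array([70.2, 81.5, 65.0, 90.3, 77.8, 58.4])
r_kg = np.array([71.0, 80.9, 66.2, 92.1, 76.5, 59.0])
print(f"Pearson r: {pearsonr(q_kg, r_kg)[0]:.3f}")

# ICC treats the questionnaire and the record as two "raters" of the same subjects.
long = pd.DataFrame({
    "subject": list(range(len(q_kg))) * 2,
    "source": ["questionnaire"] * len(q_kg) + ["record"] * len(r_kg),
    "kg": np.concatenate([q_kg, r_kg]),
})
print(pg.intraclass_corr(data=long, targets="subject", raters="source", ratings="kg"))
```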