![AgreeStat/360: computing weighted agreement coefficients (Conger's kappa, Fleiss' kappa, Gwet's AC1/AC2, Krippendorff's alpha, and more) for 3 raters or more](https://www.agreestat.com/examples/pictures/cac_data_3raters_raw.png)
![Inter-rater agreement measured using Cohen's Kappa and Krippendorff's...](https://www.researchgate.net/publication/323285265/figure/fig3/AS:941695235539000@1601529048659/Inter-rater-agreement-measured-using-Cohens-Kappa-and-Krippendorffs-Alpha-in-both.gif)
![An Empirical Comparative Assessment of Inter-Rater Agreement of Binary Outcomes and Multiple Raters (Symmetry)](https://www.mdpi.com/symmetry/symmetry-14-00262/article_deploy/html/images/symmetry-14-00262-g001.png)
![K. Gwet's Inter-Rater Reliability Blog: Benchmarking Agreement Coefficients (Cohen's kappa, Gwet's AC1/AC2, Krippendorff's alpha)](https://1.bp.blogspot.com/-8lLMKISEeRo/VP2kWbXou8I/AAAAAAAAIFY/8kbySM4sPPM/w1200-h630-p-k-no-nu/altman_benchmark_scale.jpg)
![AgreeStat/360: computing weighted agreement coefficients from a contingency table (Cohen's kappa, Gwet's AC1/AC2, Krippendorff's alpha, and more)](https://www.agreestat.com/examples/pictures/cac_output_contingency_weighted.png)
![Measuring inter-rater reliability for nominal data – which coefficients and confidence intervals are appropriate? (BMC Medical Research Methodology)](https://media.springernature.com/lw685/springer-static/image/art%3A10.1186%2Fs12874-016-0200-9/MediaObjects/12874_2016_200_Fig4_HTML.gif)
![AgreeStat/360: computing agreement coefficients from a contingency table (Cohen's kappa, Gwet's AC1/AC2, Krippendorff's alpha, and more)](https://www.agreestat.com/examples/pictures/cac_output_contingency.png)