![AgreeStat/360: computing weighted agreement coefficients (Conger's kappa, Fleiss' kappa, Gwet's AC1/AC2, Krippendorff's alpha, and more) for 3 raters or more](https://www.agreestat.com/examples/pictures/cac_data_3raters_raw.png)

AgreeStat/360: computing weighted agreement coefficients (Conger's kappa, Fleiss' kappa, Gwet's AC1/AC2, Krippendorff's alpha, and more) for 3 raters or more
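AgreeStat/360 is a point-and-click tool, but the coefficients it reports can also be reproduced in code. As a rough orientation only (not AgreeStat's implementation), here is a minimal NumPy sketch of Gwet's AC1 for the simplest case of two raters, using the chance-agreement formula from Gwet (2008); the function and variable names are illustrative.

```python
import numpy as np

def gwet_ac1(ratings_a, ratings_b, categories):
    """Gwet's AC1 for two raters (illustrative sketch, not AgreeStat's code).

    ratings_a, ratings_b : sequences of category labels, one entry per subject.
    categories           : list of all possible category labels.
    """
    a = np.asarray(ratings_a)
    b = np.asarray(ratings_b)
    q = len(categories)

    # Observed agreement: fraction of subjects the two raters classify identically.
    p_a = np.mean(a == b)

    # pi_k: average of the two raters' marginal proportions for category k.
    pi = np.array([(np.mean(a == k) + np.mean(b == k)) / 2 for k in categories])

    # Chance agreement under Gwet's model: (1/(q-1)) * sum_k pi_k * (1 - pi_k).
    p_e = np.sum(pi * (1 - pi)) / (q - 1)

    return (p_a - p_e) / (1 - p_e)

# Toy example: two raters assigning 10 subjects to 3 categories.
rater1 = ["x", "x", "y", "y", "z", "x", "y", "z", "z", "x"]
rater2 = ["x", "x", "y", "z", "z", "x", "y", "z", "y", "x"]
print(gwet_ac1(rater1, rater2, categories=["x", "y", "z"]))
```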
![Fleiss' Kappa for the agreement. Each bar represents the agreement on an item.](https://www.researchgate.net/publication/355584496/figure/fig2/AS:1083034673655815@1635226996087/Fleiss-Kappa-for-the-agreement-Each-bar-represents-the-agreement-on-an-item.png)

Fleiss' Kappa for the agreement. Each bar represents the agreement on an item.
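The figure above reports Fleiss' kappa per item; for a full table of ratings from three or more raters, a common way to get the overall coefficient in Python is `statsmodels.stats.inter_rater.fleiss_kappa`. A minimal sketch with made-up ratings (subjects in rows, raters in columns):

```python
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Made-up data: 6 subjects rated by 4 raters on 3 categories (0, 1, 2).
ratings = np.array([
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [2, 2, 1, 2],
    [0, 1, 0, 0],
    [2, 2, 2, 2],
    [1, 0, 1, 1],
])

# aggregate_raters turns the subjects-by-raters matrix into the
# subjects-by-categories count table that fleiss_kappa expects.
table, categories = aggregate_raters(ratings)
print(fleiss_kappa(table, method="fleiss"))
```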
Fleiss, J. L., & Cohen, J. (1973). The Equivalence of Weighted Kappa and the Intraclass Correlation Coefficient as Measures of Reliability. *Educational and Psychological Measurement*.
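Fleiss and Cohen's result is that kappa with quadratic weights agrees with the intraclass correlation coefficient (apart from a term that becomes negligible for large samples). In scikit-learn the weighted variant is available through the `weights` argument of `cohen_kappa_score`; a brief sketch with invented ordinal ratings:

```python
from sklearn.metrics import cohen_kappa_score

# Invented ordinal ratings (1-5 scale) from two raters on the same 8 subjects.
rater1 = [1, 2, 3, 4, 5, 3, 2, 4]
rater2 = [1, 3, 3, 4, 4, 2, 2, 5]

# Unweighted kappa treats every disagreement the same; quadratic weights
# penalize disagreements by the squared distance between categories, which is
# the variant Fleiss & Cohen (1973) relate to the intraclass correlation.
print(cohen_kappa_score(rater1, rater2))                       # unweighted
print(cohen_kappa_score(rater1, rater2, weights="quadratic"))  # quadratic weights
```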
![Multi-Class Metrics Made Simple, Part III: the Kappa Score (aka Cohen's Kappa Coefficient) | by Boaz Shmueli | Towards Data Science](https://miro.medium.com/v2/resize:fit:1258/0*xoNLU_pV4uLzpAWp.png)

Multi-Class Metrics Made Simple, Part III: the Kappa Score (aka Cohen's Kappa Coefficient), by Boaz Shmueli (Towards Data Science)
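The kappa score in that article is the standard two-rater Cohen's kappa, κ = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement and p_e is the agreement expected by chance from the raters' marginals. A short sketch (toy labels, illustrative names) that computes it from the confusion matrix and checks it against scikit-learn:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, confusion_matrix

# Toy labels: two raters classifying 10 subjects into categories "a"/"b"/"c".
y1 = ["a", "a", "b", "b", "c", "a", "b", "c", "c", "a"]
y2 = ["a", "b", "b", "b", "c", "a", "b", "c", "b", "a"]

cm = confusion_matrix(y1, y2)                   # counts: rows = rater 1, cols = rater 2
n = cm.sum()
p_o = np.trace(cm) / n                          # observed agreement (diagonal share)
p_e = (cm.sum(axis=1) @ cm.sum(axis=0)) / n**2  # chance agreement from the marginals

kappa = (p_o - p_e) / (1 - p_e)
print(kappa, cohen_kappa_score(y1, y2))         # the two values should match
```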