Agreement Statistics for 2 Raters
Computes some agreement statistics for two raters, including raw agreement, Scott's pi, Cohen's kappa, Gwet's AC1, and Aickin's alpha (see Gwet, 2010).
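The chance-corrected coefficients in this family generally share the form (Po - Pe) / (1 - Pe), differing only in how the chance-agreement term Pe is computed from the margins of the contingency table. The following minimal base-R sketch (not the package's implementation; toy_agree2 is a hypothetical helper) illustrates this for raw agreement, Cohen's kappa, and Scott's pi:
# minimal base-R sketch (not the package's implementation): raw agreement,
# Cohen's kappa, and Scott's pi from a two-column data frame of ratings
toy_agree2 <- function(y){
    lev <- sort(unique(unlist(y)))                  # common category set
    tab <- table(factor(y[,1], levels=lev), factor(y[,2], levels=lev))
    p <- tab / sum(tab)                             # joint proportions
    Po <- sum(diag(p))                              # raw (observed) agreement
    Pe_k <- sum(rowSums(p) * colSums(p))            # chance term for kappa
    m <- ( rowSums(p) + colSums(p) ) / 2            # averaged margins
    Pe_pi <- sum(m^2)                               # chance term for Scott's pi
    c(agree_raw=Po, kappa=(Po - Pe_k)/(1 - Pe_k), pi=(Po - Pe_pi)/(1 - Pe_pi))
}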
Arguments
- y
Data frame with responses for two raters
- w
Optional vector of frequency weights
- symmetrize
Logical indicating whether the contingency table should be symmetrized
- tol
Vector of integers indicating the tolerance used for raw agreement
- object
Object of class immer_agree2
- digits
Number of digits after the decimal point used for rounding
- ...
Further arguments to be passed
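A call that spells out all arguments might look as follows; the defaults shown for symmetrize and tol are assumptions and should be checked against ?immer_agree2 in the installed package version:
# hypothetical full call; defaults for symmetrize and tol are assumptions
res <- immer::immer_agree2( y=y, w=w, symmetrize=FALSE, tol=c(0,1) )
summary(res, digits=3)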
Value
List with entries
- agree_raw
Raw agreement
- agree_stats
Agreement statistics
- agree_table
Contingency table
- marg
Marginal frequencies
- Pe
Expected chance agreement probabilities
- PH
Probabilities for hard-to-classify subjects according to Aickin
- nobs
Number of observations
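The entries can be accessed directly from the returned list, for example:
# assuming res is the value of an immer_agree2() call
res$Pe     # expected chance agreement probabilities
res$PH     # Aickin's probabilities for hard-to-classify subjects
res$marg   # marginal frequencies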
References
Gwet, K. L. (2010). Handbook of inter-rater reliability. Gaithersburg: Advanced Analytics.
See also
For more inter-rater agreement statistics, see the R packages agRee, Agreement, agrmt, irr, obs.agree, and rel.
Examples
#############################################################################
# EXAMPLE 1: Dataset in Schuster & Smith (2006)
#############################################################################
data(data.immer08)
dat <- data.immer08
# first two columns: ratings of the two raters
y <- dat[,1:2]
# third column: frequency weights
w <- dat[,3]
# agreement statistics
res <- immer::immer_agree2( y=y, w=w )
summary(res)
# extract some output values
res$agree_stats
res$agree_raw
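A variation using the symmetrize argument, as a sketch:
# agreement statistics based on a symmetrized contingency table
res2 <- immer::immer_agree2( y=y, w=w, symmetrize=TRUE )
res2$agree_table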