Benchmarking Kappa: Interrater Agreement in Software Process Assessments
October 22nd, 1999

Abstract. Software process assessments are now a prevalent tool for process improvement and contract risk assessment in the software industry. Because scores are assigned to processes during an assessment, a process assessment can be considered a subjective measurement procedure. As with any subjective measurement procedure, the reliability of process assessments has important implications for the utility of assessment scores, and reliability can therefore be taken as a criterion for evaluating an assessment's quality. The particular type of reliability of interest in this paper is interrater agreement. Thus far, empirical evaluations of the interrater agreement of assessments have used Cohen's Kappa coefficient. Once a Kappa value has been derived, the next question is "how good is it?" Benchmarks for interpreting obtained Kappa values are available in the social sciences and medical literature. However, the applicability of these benchmarks to the software…
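As background for the Kappa coefficient the abstract discusses: Cohen's Kappa corrects the observed agreement between two raters for the agreement expected by chance, κ = (p_o − p_e)/(1 − p_e), where p_o is the proportion of items the raters scored identically and p_e is the agreement expected if the raters assigned scores independently. A minimal sketch in Python (the two assessors' ratings below are invented purely for illustration):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items both raters scored identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if the raters' score distributions were independent.
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    p_e = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two assessors rating six processes.
a = [1, 2, 2, 3, 1, 2]
b = [1, 2, 3, 3, 1, 1]
print(cohens_kappa(a, b))  # → 0.52
```

Here p_o = 4/6 and p_e = 11/36, giving κ = 0.52, which widely used benchmarks from the social sciences (e.g. Landis and Koch) would label "moderate" agreement; whether such labels transfer to process assessments is precisely the question the paper raises.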
