Classification Performance of Answer-Copying Indices Under Different Types of IRT Models

Research output: Contribution to journal › Article


Abstract

Test fraud has recently received increased attention in the field of educational testing, and comprehensive integrity analyses after test administration are recommended for investigating different types of potential test fraud. One type of test fraud involves answer copying between two examinees, and numerous statistical methods have been proposed in the literature to screen for and identify unusual response similarity or irregular response patterns on multiple-choice tests. The current study examined the classification performance of answer-copying indices, measured by the area under the receiver operating characteristic (ROC) curve, under different item response theory (IRT) models (the one-parameter [1PL], two-parameter [2PL], and three-parameter [3PL] models and the nominal response model [NRM]) using both simulated and real response vectors. The results indicated that, although the indices performed slightly better with nominal response outcomes in the low-copying condition (20%), they performed similarly with nominal and dichotomous response outcomes in the 40% and 60% copying conditions. The results also indicated that the performance observed with simulated response vectors was almost identically reproduced with real response vectors.
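The classification metric used in the study, the area under the ROC curve, has a convenient probabilistic reading: it is the probability that a randomly chosen copying pair receives a higher index value than a randomly chosen honest pair. A minimal sketch of that computation is below; the index values and pair labels are made up for illustration and are not taken from the article.

```python
# Sketch: evaluating an answer-copying index by ROC AUC.
# The numbers below are hypothetical index values, not data from the study.

def roc_auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney U statistic: the probability that a randomly
    chosen copying pair (positive) outscores a non-copying pair (negative),
    with ties counted as 0.5."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical similarity-index values for simulated copying pairs
# and honest (non-copying) pairs:
copying = [2.9, 3.4, 2.1, 4.0]
honest = [0.3, 1.1, 2.5, 1.8, 0.5]

print(roc_auc(copying, honest))  # one honest pair outscores one copying pair
```

An AUC of 1.0 would mean the index separates copying from honest pairs perfectly; 0.5 means it does no better than chance, so values in between quantify the screening power the study compares across IRT models.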

Original language: English (US)
Pages (from-to): 592-607
Number of pages: 16
Journal: Applied Psychological Measurement
Volume: 40
Issue number: 8
State: Published - Nov 1 2016


Keywords

  • answer copying
  • item response theory
  • person fit
  • response similarity
  • test score integrity
  • test security

ASJC Scopus subject areas

  • Social Sciences (miscellaneous)
  • Psychology (miscellaneous)
