Quantifying the impact of partial measurement invariance in diagnostic research: An application to addiction research

Citation metadata

Date: July 2019
From: Addictive Behaviors (Vol. 94)
Publisher: Elsevier B.V.
Document Type: Report
Length: 335 words


Abstract:

Keywords: Measurement invariance; Diagnostic accuracy; Alcohol use disorder; Practical significance; Addiction research

Highlights:
* Noninvariant measurement can bias prevalence rate estimates in subgroups
* Diagnostic accuracy analysis quantifies practical significance of partial invariance
* Noninvariant alcohol dependence items across age did not affect diagnostic accuracy
* The diagnostic accuracy analysis can be easily implemented with the R program

Establishing measurement invariance, or that an instrument measures the same construct(s) in the same way across subgroups of respondents, is crucial in efforts to validate social and behavioral instruments. Although substantial previous research has focused on detecting the presence of noninvariance, less attention has been devoted to its practical significance, and even less has been paid to its possible impact on diagnostic accuracy. In this article, we draw additional attention to the importance of measurement invariance and advance diagnostic research by introducing a novel approach for quantifying the impact of noninvariance with binary items (e.g., the presence or absence of symptoms). We illustrate this approach by testing measurement invariance and evaluating diagnostic accuracy across age groups using DSM alcohol use disorder items from a public national data set. By providing researchers with an easy-to-implement R program for examining diagnostic accuracy with binary items, this article sets the stage for future evaluations of the practical significance of partial invariance. Future work can extend our framework to include ordinal and categorical indicators, other measurement models in item response theory, settings with three or more groups, and comparison to an external, "gold-standard" validator.
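The core idea of the abstract — that noninvariant binary items can shift subgroup prevalence estimates and degrade diagnostic classification — can be illustrated with a small simulation. The sketch below is not the authors' R program; it is a hypothetical Python example assuming a two-parameter logistic (2PL) IRT model for five binary symptom items, where two items are made harder in a focal group (simulated noninvariance). Classification based on the noninvariant responses is then compared against the classification that would have resulted under invariance. All item parameters and the "2+ symptoms" diagnostic rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 10_000                      # respondents in the focal subgroup
theta = rng.normal(size=n)      # latent severity

# Hypothetical 2PL item parameters for 5 binary symptom items.
a = np.array([1.2, 1.0, 1.5, 0.8, 1.1])        # discriminations
b_ref = np.array([0.0, 0.5, 1.0, -0.5, 0.3])   # invariant difficulties
b_focal = b_ref.copy()
b_focal[:2] += 0.4              # two noninvariant items: harder in focal group

# Shared uniform draws so the ONLY difference between the two response
# sets is the difficulty shift, not Monte Carlo noise.
u = rng.uniform(size=(n, len(a)))

def endorse(b):
    """Binary 2PL responses given item difficulties b."""
    p = 1 / (1 + np.exp(-a * (theta[:, None] - b)))
    return (u < p).astype(int)

y_ref = endorse(b_ref)       # responses if the items were invariant
y_focal = endorse(b_focal)   # responses under partial noninvariance

# Illustrative DSM-style diagnosis: 2 or more symptoms endorsed.
dx_ref = y_ref.sum(axis=1) >= 2
dx_focal = y_focal.sum(axis=1) >= 2

# Diagnostic-accuracy summary: treat the invariant classification as the
# criterion and tabulate how often the noninvariant items agree with it.
sens = (dx_focal & dx_ref).sum() / dx_ref.sum()
spec = (~dx_focal & ~dx_ref).sum() / (~dx_ref).sum()
prev_shift = dx_focal.mean() - dx_ref.mean()

print(f"sensitivity={sens:.3f} specificity={spec:.3f} "
      f"prevalence shift={prev_shift:+.3f}")
```

Because the noninvariant items are strictly harder here, the shift can only suppress symptom endorsement: some true cases are missed (sensitivity falls below 1) and the estimated prevalence in the focal group is biased downward, which is the kind of practical consequence the diagnostic accuracy analysis is designed to quantify.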
Author affiliations:
(a) Department of Psychology, University of Southern California, Los Angeles, CA, USA
(b) School of Human Services, University of Cincinnati, Cincinnati, OH, USA
(c) Department of Human Development and Family Studies, The Pennsylvania State University, University Park, PA, USA
* Corresponding author at: Department of Psychology, University of Southern California, Los Angeles, CA 90089-1061, USA.

Article history: Received 1 May 2018; Revised 20 September 2018; Accepted 19 November 2018

Byline: Mark H.C. Lai [hokchiol@usc.edu] (a,*), George B. Richardson (b), Hio Wa Mak (c)

Source Citation

Gale Document Number: GALE|A586899678