Do experimental and nonexperimental evaluations give different answers about the effectiveness of government-funded training programs?

David H. Greenberg, Charles Michalopoulos, Philip K. Robins

Research output: Contribution to journal › Review article

20 Scopus citations

Abstract

This paper uses meta-analysis to investigate whether random assignment (or experimental) evaluations of voluntary government-funded training programs for the disadvantaged have produced different conclusions than nonexperimental evaluations. The analysis draws on several hundred estimates from 31 evaluations of 15 programs that operated between 1964 and 1998. The results suggest that experimental and nonexperimental evaluations yield similar conclusions about the effectiveness of training programs, but that estimates of average effects for youth, and possibly men, might have been larger in experimental studies. The results also suggest that variation among nonexperimental estimates of program effects is similar to variation among experimental estimates for men and youth, but not for women (for whom it appears to be larger), although small sample sizes make the estimated differences somewhat imprecise for all three groups. The policy implications of the findings are discussed.

Original language: English (US)
Pages (from-to): 523-552
Number of pages: 30
Journal: Journal of Policy Analysis and Management
Volume: 25
Issue number: 3
State: Published - Jun 1 2006

ASJC Scopus subject areas

  • Business, Management and Accounting(all)
  • Sociology and Political Science
  • Public Administration

