Do Experimental and Nonexperimental Evaluations Give Different Answers about the Effectiveness of Government-Funded Training Programs?

David H. Greenberg, Charles Michalopoulos, Philip Robins

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

This paper uses meta-analysis to investigate whether random assignment (or experimental) evaluations of voluntary government-funded training programs for the disadvantaged have produced different conclusions from nonexperimental evaluations. The data include several hundred estimates from 31 evaluations of 15 programs that operated between 1964 and 1998. The results suggest that experimental and nonexperimental evaluations yield similar conclusions about the effectiveness of training programs, but that estimates of average effects for youth, and possibly men, might have been larger in experimental studies. The results also suggest that variation among nonexperimental estimates of program effects is similar to variation among experimental estimates for men and youth, but not for women (for whom it seems to be larger), although small sample sizes make the estimated differences somewhat imprecise for all three groups. The policy implications of the findings are discussed.
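
For readers who want a concrete sense of the comparison the abstract describes, the following is a minimal sketch in Python, not the authors' actual specification: it assumes made-up effect estimates and standard errors for eight hypothetical evaluations, regresses the estimates on an experimental-design indicator with inverse-variance precision weights (so the indicator's coefficient is the average experimental-nonexperimental gap), and then compares the spread of the two sets of point estimates.

import numpy as np

# Hypothetical effect estimates (say, annual earnings impacts in dollars)
# and standard errors from eight evaluations; all numbers are illustrative
# and are not taken from the chapter.
est = np.array([850.0, 1200.0, 400.0, 950.0, 700.0, 1100.0, 300.0, 600.0])
se = np.array([300.0, 350.0, 280.0, 320.0, 250.0, 400.0, 260.0, 300.0])
experimental = np.array([1, 1, 1, 1, 0, 0, 0, 0])  # 1 = random assignment

# Precision-weighted (inverse-variance) regression of the estimates on an
# experimental-design dummy: beta[1] is the average difference between
# experimental and nonexperimental estimates.
w = 1.0 / se**2
X = np.column_stack([np.ones_like(est), experimental])
XtWX = X.T @ np.diag(w) @ X
beta = np.linalg.solve(XtWX, X.T @ (w * est))
cov = np.linalg.inv(XtWX)  # fixed-effect variance; ignores between-study heterogeneity
print(f"nonexperimental mean effect: {beta[0]:.0f}")
print(f"experimental minus nonexperimental gap: {beta[1]:.0f} (SE {cov[1, 1] ** 0.5:.0f})")

# The paper's second question: is the spread of nonexperimental estimates
# larger than the spread of experimental ones? A crude check is the ratio
# of sample variances of the point estimates.
v_exp = est[experimental == 1].var(ddof=1)
v_non = est[experimental == 0].var(ddof=1)
print(f"variance ratio (nonexperimental / experimental): {v_non / v_exp:.2f}")

A full analysis along the chapter's lines would also model heterogeneity across programs and subgroups (women, men, youth), for example with a random-effects specification.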

Original language: English (US)
Title of host publication: Social Experimentation, Program Evaluation, and Public Policy
Publisher: Blackwell Publishing Ltd
Pages: 37-64
Number of pages: 28
ISBN (Print): 9781405193931
DOI: 10.1002/9781444307399.ch6
State: Published - Apr 21 2009


Keywords

  • Data
  • Descriptive Findings
  • Method
  • Regression Results
  • Some Shortcomings of the Descriptive Findings
  • Variation in Impacts

ASJC Scopus subject areas

  • Social Sciences (all)

Cite this

Greenberg, D. H., Michalopoulos, C., & Robins, P. (2009). Do Experimental and Nonexperimental Evaluations Give Different Answers about the Effectiveness of Government-Funded Training Programs? In Social Experimentation, Program Evaluation, and Public Policy (pp. 37-64). Blackwell Publishing Ltd. https://doi.org/10.1002/9781444307399.ch6

@inbook{beb96d339a5249ffa318f737d8c5d425,
title = "Do Experimental and Nonexperimental Evaluations Give Different Answers about the Effectiveness of Government-Funded Training Programs?",
abstract = "This paper uses meta-analysis to investigate whether random assignment (or experimental) evaluations of voluntary government-funded training programs for the disadvantaged have produced different conclusions than nonexperimental evaluations. Information includes several hundred estimates from 31 evaluations of 15 programs that operated between 1964 and 1998. The results suggest that experimental and nonexperimental evaluations yield similar conclusions about the effectiveness of training programs, but that estimates of average effects for youth and possibly men might have been larger in experimental studies. The results also suggest that variation among nonexprimental estimates of program effects is similar to variation among experimental estimates for men and youth, but not for women (for whom it seems to be larger), although small sample sizes make the estimated differences somewhat imprecise for all three groups. The policy implications of the findings are discussed.",
keywords = "Data, Descriptive Findings, Method, Regression Results, Some Shortcomings of the Descriptive Findings, Variation in Impacts",
author = "Greenberg, {David H.} and Charles Michalopoulos and Philip Robins",
year = "2009",
month = "4",
day = "21",
doi = "10.1002/9781444307399.ch6",
language = "English (US)",
isbn = "9781405193931",
pages = "37--64",
booktitle = "Social Experimentation, Program Evaluation, and Public Policy",
publisher = "Blackwell Publishing Ltd",

}
