Conducting meta-analyses of evaluations of government-funded training programs

David H. Greenberg, Philip K. Robins, Robert Walker

Research output: Contribution to journal › Article › peer-review

2 Scopus citations


Government-funded training programs in the United States have often been subject to rigorous evaluation. Indeed, many of these programs have been evaluated with random assignment, although sophisticated quasi-experimental methods have also been used. Until very recently, however, there has been little systematic attempt to use the cumulative information vested in these evaluations to determine which kinds of programs work best, in which settings, and for which types of clients. Meta-analysis - a set of statistical procedures for systematically synthesizing findings from separate studies - can, in theory at least, address these and other topics that evaluations of individual programs cannot. This article discusses the steps in conducting such a synthesis, summarizes the results of three recently conducted meta-analyses of training and welfare-to-work programs, identifies limitations of the meta-analytic approach, and considers ways in which some of these limitations can be overcome.
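The core of the statistical synthesis the abstract describes is typically inverse-variance weighting: each study's effect estimate is weighted by the inverse of its sampling variance, so more precise evaluations count for more in the pooled result. A minimal fixed-effect sketch, with invented effect sizes and standard errors purely for illustration (not drawn from the article's data):

```python
import math

# Hypothetical effect estimates (e.g., program impacts on earnings)
# and their standard errors from three illustrative evaluations.
# These numbers are invented for demonstration only.
effects = [0.20, 0.35, 0.10]
std_errs = [0.08, 0.12, 0.05]

# Fixed-effect meta-analysis: weight each study by the inverse of its
# sampling variance (1 / SE^2).
weights = [1.0 / se ** 2 for se in std_errs]

# Pooled estimate is the weighted mean; its standard error is
# sqrt(1 / sum of weights).
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

print(f"pooled effect = {pooled:.3f} (SE = {pooled_se:.3f})")
```

A random-effects model, which the studies summarized in the article may also use, would add a between-study variance component to each weight; the fixed-effect version above shows only the basic pooling logic.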

Original language: English (US)
Pages (from-to): 345-367
Number of pages: 23
Journal: Review of Policy Research
Issue number: 3
State: Published - May 1 2005

ASJC Scopus subject areas

  • Geography, Planning and Development
  • Public Administration
  • Management, Monitoring, Policy and Law

