Mechanism-Based Explanations in the Impact Evaluation of Public Interventions: Digging Deeper into Why and How Programmes Worked or Failed to Work

Authors

Paulina Kubera

DOI:

https://doi.org/10.15678/ZP.2021.58.4.02

Keywords:

evidence-based policy, evaluation, theory of change, causal mechanisms, realist synthesis

Abstract

Objectives:
The aim of the article is to consider the evaluation of public interventions through the prism of evidence-based policy (EBP) and, more specifically, its potential to address the problem of how impact evaluation can produce usable knowledge that helps improve policymaking and policy implementation and that can be accumulated over time, so that evaluations are not single endeavours and one-off studies, but contribute to a growing body of knowledge.

Research Design & Methods:
The article provides a critical overview of the research literature on evaluation approaches and methods as tools for gathering and appraising evidence relevant to policymaking and policy implementation.

Findings:
Building upon the identified limitations of the traditional input/output approach to the impact evaluation of public interventions, alternative approaches to evaluation are considered that make use of a theory which explicates the causal mechanisms linking programme activities with programme outcomes and confronts these mechanisms with empirical observations. As a strategy for synthesising the knowledge gained, realist synthesis is considered more appropriate than traditional meta-analysis for reviewing research on complex social interventions.

Implications / Recommendations:
The article demonstrates how theory-based evaluation with mechanistic explanation and realist synthesis can contribute to the growing body of evidence for policy needs, while also identifying their limitations and the practical problems related to their implementation.

Contribution / Value Added:
The article contributes to the existing pool of knowledge by providing important insights into how mechanism-based explanations can be used in impact evaluation to make stronger causal claims and enhance policy learning.


References

Andersson, T., Appelquist, J., & Serger, S. (2004). Public research and innovation policy for the good of society: How to assess the way forward? Background paper at the joint IKED/INNOVA seminar, November 8, 2004, Berns Salonger, Stockholm, pp. 17–34. Available at: http://www.iked.org/pdf/Evalproceedings05.06.03.pdf (accessed: 09.10.2021).

Astbury, B., & Leeuw, F. (2010). Unpacking Black Boxes: Mechanisms and Theory Building in Evaluation. American Journal of Evaluation, 31(3), 363–381.

Bach, L., & Matt, M. (2005). From economic foundations to S&T policy tools: A comparative analysis of the dominant paradigms. In P. Llerena & M. Matt (Eds.), Innovation Policy in a Knowledge-Based Economy (pp. 17–46). Springer.

Beach, D., & Pedersen, R. (2016). Causal Case Study Methods: Foundations and Guidelines for Comparing, Matching, and Tracing. University of Michigan Press.

Befani, B. (2012). Models of causality and causal inference (Appendix, pp. 1–24). In E. Stern, N. Stame, J. Mayne, K. Forss, R. Davies, & B. Befani, Broadening the Range of Designs and Methods for Impact Evaluations (DFID Working Paper 38). Department for International Development.

Biedenbach, T., Marell, A., & Vanyushyn, V. (2018). Industry-university collaboration and absorptive capacity: An empirical study in a Swedish context. International Journal of Technology Management, 76(1/2), 81–103.

BIS (2011). Equity finance programmes qualitative reviews of: a) UKHTF and b) The Bridges Fund. Available at: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/32220/11-1009-bis-equity-finance-qualitativereviews-ukhtf-bridges.pdf (accessed: 07.05.2022).

Bruder, M., Polak, J., Guffler, K., & Scheinert, L. (2020). Using mediation analysis to uncover psychological mechanisms of attitude change in a development volunteer program. New Directions for Evaluation, 167, 131–143.

Byrne, D. (2013). Evaluating complex social interventions in a complex world. Evaluation, 19(3), 217–228. https://doi.org/10.1177/1356389013495617.

Chaney, B. (2015). Reconsidering Findings of “No Effects” in Randomized Control Trials: Modeling Differences in Treatment Impacts. American Journal of Evaluation, 37(1), 45–62. https://doi.org/10.1177/1098214015573788.

Chen, H., & Rossi, P. (1989). Issues in the Theory-driven Perspective. Evaluation and Program Planning, 12(4), 299–306.

Cook, T. (2000). The false choice between theory-based evaluation and experimentation. New Directions for Evaluation, 87, 27–34.

Dreyfus, H. (2011). Medicine as combining natural and human science. The Journal of Medicine and Philosophy, 36(4), 335–341. https://doi.org/10.1093/jmp/jhr027.

European Commission (2017). Quality of Public Administration. A Toolbox for Practitioners. Publications Office of the European Union.

Head, B. (2010). Reconsidering evidence-based policy: Key issues and challenges. Policy and Society, 29(2), 77–94. https://doi.org/10.1016/j.polsoc.2010.03.001.

Hedström, P., & Wennberg, K. (2017). Causal mechanisms in organization and innovation studies. Innovation: Organization & Management, 19(1), 91–102. https://doi.org/10.1080/14479338.2016.1256779.

Jagosh, J. (2019). Realist Synthesis for Public Health: Building an Ontologically Deep Understanding of How Programs Work, For Whom, and In Which Contexts. Annual Review of Public Health, 40, 361–372. https://doi.org/10.1146/annurev-publhealth-031816-044451.

Józefowski, B. (2012). Evidence-based policy – nowa koncepcja dotykająca starych problemów. Zarządzanie Publiczne, 1(19), 147–152.

Kim, P. H., Wennberg, K., & Croidieu, G. (2016). Untapped Riches of Meso-Level Applications in Multilevel Entrepreneurship Mechanisms. The Academy of Management Perspectives, 30(3), 273–291.

Lemire, S., Kwako, A., Nielsen, S. B., Christie, C. A., Donaldson, S. I., & Leeuw, F. L. (2020). What is this thing called a mechanism? Findings from a review of realist evaluations. New Directions for Evaluation, 167, 73–86.

Mayne, J. (2015). Useful Theory of Change Models. Canadian Journal of Program Evaluation, 30(2), 119–142.

OECD (2008). Framework for the Evaluation of SME and Entrepreneurship Policies and Programmes. https://doi.org/10.1787/9789264040090.

Patton, M. Q. (2008). Utilization-Focused Evaluation (4th ed.). Sage.

Pawson, R. (2003). Nothing as Practical as Good Theory. Evaluation, 9(4), 471–490. https://doi.org/10.1177/135638900300900407.

Pawson, R. (2006). Digging for Nuggets: How ‘Bad’ Research Can Yield ‘Good’ Evidence. International Journal of Social Research Methodology, 9(2), 361–372. https://doi.org/10.1080/13645570600595314.

Pawson, R. (2008). Causality for beginners. In: NCRM Research Methods Festival 2008 (unpublished). Available at: https://eprints.ncrm.ac.uk/id/eprint/245/ (accessed: 14.10.2021).

Pawson, R. (2013). The Science of Evaluation. A Realist Manifesto. Sage.

Pawson, R., Greenhalgh, T., Harvey, G., & Walshe, K. (2004). Realist synthesis: An introduction. ESRC Research Methods Programme, University of Manchester.

Pawson, R., & Tilley, N. (1997). Realistic Evaluation. Sage.

Peck, L. (2020). Leveraging experimental evaluations for understanding causal mechanisms. New Directions for Evaluation, 167, 145–160.

Picciotto, R. (2012). Experimentalism and development evaluation: Will the bubble burst? Evaluation, 18(2), 213–229. https://doi.org/10.1177/1356389012440915.

Punton, M., Vogel, I., & Lloyd, R. (2016). Reflections from a realist evaluation in progress: Scaling ladders and stitching theory (CDI Practice Paper). Institute of Development Studies. Available at: https://archive.ids.ac.uk/cdi/publications/reflectionsrealist-evaluation-progress-scaling-ladders-andstitching-theory-0.html (accessed: 30.04.2020).

Rousseau, D. (2006). Is There Such a Thing as “Evidence-Based Management”? Academy of Management Review, 31(2), 256–269.

Rycroft-Malone, J., McCormack, B., Hutchinson, A., DeCorby, K., Bucknall, T., Kent, B., Schultz, A., Snelgrove-Clarke, E., Stetler, C., Titler, M., Wallin, L., & Wilson, V. (2012). Realist synthesis: Illustrating the method for implementation research. Implementation Science, 7(33), 1–10.

Saunders, M. (2011). Capturing effects of interventions, policies and programmes in the European context: A social practice perspective. Evaluation, 17(1), 89–102. https://doi.org/10.1177/1356389010394480.

Schmitt, J. (2020). The causal mechanism claim in evaluation: Does the prophecy fulfil? New Directions for Evaluation, 167, 11–26.

Stame, N. (2004). Theory-based Evaluation and Types of Complexity. Evaluation, 10(1), 58–76. https://doi.org/10.1177/1356389004043135.

Storey, D. J. (2000). Six Steps to Heaven: Evaluating the Impact of Public Policies to Support Small Business in Developed Economies. In D. L. Sexton & H. Landstrom (Eds.), Handbook of Entrepreneurship (pp. 176–193). Blackwell.

Weiss, C. (1997). Theory-based Evaluation: Past, Present and Future. New Directions for Evaluation, 76, 41–55. https://doi.org/10.1002/ev.1086.

Published

2022-11-21

How to Cite

Kubera, P. (2022). Mechanism-Based Explanations in the Impact Evaluation of Public Interventions: Digging Deeper into Why and How Programmes Worked or Failed to Work. Journal of Public Governance, 58(4), 17–26. https://doi.org/10.15678/ZP.2021.58.4.02