Colombia Internacional

Colomb. int. | eISSN 1900-6004 | ISSN 0121-5612

On the Use of Evidence and External Validity in the Evaluation of Social Interventions: A Critical Overview

No. 105 (2021-01-01)
  • Juan David Parra
    Universidad del Norte (Colombia)

Abstract

Objective/Context: Mainstream evaluations of social interventions (e.g., randomized controlled trials, RCTs), which shape evidence-based policies, face growing criticism. This article focuses on two central aspects of that debate: i) the notion of evidence, as a concept and as the result of a process that entails reasoning, and ii) the understanding of external validity. Methodology: Through an analysis of the literature on evaluation and the philosophy of knowledge, I deconstruct the concept of causality in the social sciences. This exercise allows me to distinguish between successionist and generative theories of causality and to establish criteria for critically examining some epistemological postulates of experimental evaluation techniques. Conclusions: By themselves, experimental methods of evaluation do not allow for informed decisions about the efficient investment of resources. Despite their strength in quantifying possible causal effects, counterfactual statistical analyses must be complemented by forms of qualitative reasoning in order to answer questions about the direct and indirect causes of the results of social interventions, and thus to strengthen our understanding of how social policies or programs can be extrapolated from one context to another. Contribution: Few studies in Spanish criticize experimental evaluation techniques or propose alternatives to them. Rather than summarizing the arguments of other authors, this article presents a coherent narrative that invites us to rethink the current role of the evaluation of social interventions.

Keywords: Evaluation, experimental techniques, evidence, external validity

References

Bernal, Raquel and Ximena Peña. 2011. Guía práctica para la evaluación de impacto. Bogotá: Ediciones Uniandes.

Blamey, Avril and Mhairi Mackenzie. 2007. “Theories of Change and Realistic Evaluation: Peas in a Pod or Apples and Oranges?”. Evaluation 13 (4): 439-455.

Bowles, Samuel. 1998. “Endogenous Preferences: The Cultural Consequences of Markets and Other Economic Institutions”. Journal of Economic Literature 36 (1): 75-111.

Brousselle, Astrid and Jean-Marie Buregeya. 2018. “Theory-Based Evaluations: Framing the Existence of a New Theory of Evaluation and the Rise of the 5th Generation”. Evaluation 24 (2): 153-168.

Cardozo, Myriam. 2013. “De la evaluación a la reformulación de políticas públicas”. Política y Cultura 40: 123-149.

Cartwright, Nancy. 2012. “Presidential Address: Will This Policy Work for You? Predicting Effectiveness Better: How Philosophy Helps”. Philosophy of Science 79 (5): 973-989.

Cartwright, Nancy. 2013. “Knowing What We Are Talking About: Why Evidence Doesn’t Always Travel”. Evidence & Policy 9 (1): 97-112.

Cartwright, Nancy. 2017. “Single Case Causes: What is Evidence and Why”. In Philosophy of Science in Practice: Nancy Cartwright and the Nature of Scientific Reasoning, edited by Hsiang-Ke Chao and Julian Reiss, 11-24. Cham: Springer.

Cartwright, Nancy and Jeremy Hardie. 2012. Evidence-Based Policy: A Practical Guide to Doing It Better. Oxford: Oxford University Press.

Cartwright, Nancy and Jeremy Hardie. 2017. “Predicting What Will Happen When You Intervene”. Clinical Social Work Journal 45 (3): 270-279.

Danermark, Berth, Mats Ekström, Liselotte Jakobsen and Jan Ch. Karlsson. 2002. Explaining Society: Critical Realism in the Social Sciences. New York: Psychology Press.

Deaton, Angus. 2010. “Instruments, Randomization, and Learning about Development”. Journal of Economic Literature 48: 424-455.

Deaton, Angus and Nancy Cartwright. 2018. “Understanding and Misunderstanding Randomized Controlled Trials”. Social Science & Medicine 210: 2-21.

Ellis, George. 2005. “Physics, Complexity and Causality”. Nature 435: 743.

Gertler, Paul J., Sebastián Martínez, Patrick Premand, Laura B. Rawlings and Christel M. J. Vermeersch. 2011. Evaluación de impacto en la práctica. Washington D. C.: Banco Mundial.

Goodman, Lisa, Deborah Epstein and Cris Sullivan. 2018. “Beyond the RCT: Integrating Rigor and Relevance to Evaluate the Outcomes of Domestic Violence Programs”. American Journal of Evaluation 39 (1): 58-70.

Greenhalgh, Trish and Jill Russell. 2009. “Evidence-Based Policymaking: A Critique”. Perspectives in Biology and Medicine 52 (2): 304-318.

Hallam, Susan. 2010. “The Power of Music: Its Impact on the Intellectual, Social and Personal Development of Children and Young People”. International Journal of Music Education 28 (3): 269-289.

Harman, Graham. 2018. Object-Oriented Ontology. London: Pelican Books.

Hausmann, Ricardo. 2016. “El problema con las políticas basadas en evidencia”. La Nación, March 3. https://www.nacion.com/opinion/foros/el-problema-con-las-politicas-basadas-en-evidencia/ISW3U6CZYNHEVND43YO2XZWFBY/story/

Head, Brian. 2016. “Toward More ‘Evidence‐Informed’ Policy Making?”. Public Administration Review 76 (3): 472-484.

Henrich, Joseph, Robert Boyd, Samuel Bowles, Colin Camerer, Ernst Fehr, Herbert Gintis and Richard McElreath. 2001. “In Search of Homo Economicus: Behavioral Experiments in 15 Small-Scale Societies”. American Economic Review 91 (2): 73-78.

Joyce, Kathryn and Nancy Cartwright. 2020. “Bridging the Gap Between Research and Practice: Predicting What Will Work Locally”. American Educational Research Journal 57 (3): 1045-1082.

Krause, Philipp and Gonzalo Hernández. 2020. “Commentary: From Experimental Findings to Evidence-Based Policy”. World Development 127. https://doi.org/10.1016/j.worlddev.2019.104812

Krauss, Alexander. 2015. “The Scientific Limits of Understanding the (Potential) Relationship Between Complex Social Phenomena: The Case of Democracy and Inequality”. Journal of Economic Methodology 23 (1): 97-109.

Krauss, Alexander. 2018. “Why All Randomised Controlled Trials Produce Biased Results”. Annals of Medicine 50 (4): 312-322.

Lawson, Tony. 2009. “Applied Economics, Contrast Explanation and Asymmetric Information”. Cambridge Journal of Economics 33 (3): 405-419.

Manzano, Ana. 2010. “El análisis del contexto local en un programa multidisciplinario (sanidad y servicios sociales) usando el enfoque de la evaluación realista”. E-valuacion 3 (10): 24-27.

Masino, Serena and Miguel Niño-Zarazúa. 2016. “What Works to Improve the Quality of Student Learning in Developing Countries?”. International Journal of Educational Development 48: 53-65.

Monaghan, Mark, Ray Pawson and Kate Wicker. 2012. “The Precautionary Principle and Evidence-Based Policy Making”. Evidence & Policy 8 (2): 171-191.

Munro, Eileen, Nancy Cartwright, Jeremy Hardie and Eleonora Montuschi. 2016. Improving Child Safety: Deliberation, Judgement and Empirical Research. Durham: Centre for Humanities Engaging Science and Society (CHESS), Philosophy Department, Durham University.

Parra, Juan David. 2013. “Preferencias endógenas, prosocialidad y políticas públicas”. Divergencia 15: 64-71.

Parra, Juan David. 2016. “Realismo crítico: una alternativa en el análisis social”. Sociedad y Economía 31: 215-238.

Parra, Juan David. 2017. “¿Qué funciona, para quién, en qué aspectos, hasta qué punto, en qué contexto y cómo? Una introducción a la evaluación realista y sus métodos”. Economía & Región 11 (2): 11-44.

Parra, Juan David. 2018. “Critical Realism and School Effectiveness Research in Colombia: The Difference It Should Make”. British Journal of Sociology of Education 39 (1): 107-125.

Parra, Juan David. 2019. “El arte del muestreo cualitativo y su importancia para la evaluación y la investigación de políticas públicas: una aproximación realista”. Opera 25: 119-136.

Pawson, Ray. 2013. The Science of Evaluation: A Realist Manifesto. London: Sage.

Pawson, Ray and Nick Tilley. 1997. Realistic Evaluation. London; Thousand Oaks; New Delhi: Sage.

Pawson, Ray, Geoff Wong and Lesley Owen. 2011. “Known Knowns, Known Unknowns, Unknown Unknowns: The Predicament of Evidence-Based Policy”. American Journal of Evaluation 32 (4): 518-546.

Porter, Sam, Tracey McConnell and Joanne Reid. 2017. “The Possibility of Critical Realist Randomised Controlled Trials”. Trials 18: 133.

Reiss, Julian. 2018. “Against External Validity”. Synthese 196 (8): 3103-3121.

Saltelli, Andrea and Mario Giampietro. 2017. “What Is Wrong with Evidence Based Policy, and How Can It Be Improved?”. Futures 91: 62-71.

Van Belle, Sara, Geoff Wong, Gill Westhorp, Mark Pearson, Nick Emmel, Ana Manzano and Bruno Marchal. 2016. “Can ‘Realist’ Randomised Controlled Trials Be Genuinely Realist?”. Trials 17 (1): 2-6.

Verger, Antoni, Xavier Bonal and Adrián Zancajo. 2016. “What Are the Role and Impact of Public-Private Partnerships in Education? A Realist Evaluation of the Chilean Education Quasi-Market”. Comparative Education Review 60 (2): 223-248.
