Ontologies, methods and evidences: fostering the use of mixed-methods with accuracy

Journal: RIV Rassegna Italiana di Valutazione
Authors/Editors: Giancarlo Vecchi
Year of publication: 2019; Issue: 2017/69; Language: English
Pages: 16 (pp. 154-169); File size: 418 KB
DOI: 10.3280/RIV2017-069009

The debate on 'evidence-based policy making' addresses the role of evaluation, and of the social sciences more generally, in shaping the design and implementation of public programmes. The discussion is often framed in methodological terms, and what emerges is a preference for counterfactual evaluation designs. The author argues instead for a pluralist approach to methods and research designs, taking into account the different meanings that the term 'evidence' can assume once both the complexity of public policies and the information needs of policy-makers are considered. On this basis, the author suggests some strategies to strengthen the relevance of evaluators in policy-making processes.

Keywords: Evaluation; Evidence-Based Policy-Making; Ontologies; Learning; Trading-Zones.

Giancarlo Vecchi, "Ontologies, methods and evidences: fostering the use of mixed-methods with accuracy", in RIV Rassegna Italiana di Valutazione, 69/2017, pp. 154-169, DOI: 10.3280/RIV2017-069009