Per un ri-orientamento del dibattito italiano sulla valutazione della qualità della produzione scientifica

Journal: SOCIOLOGIA E POLITICHE SOCIALI
Authors/Editors: Ivo Colozzi
Year of publication: 2015. Issue: 2015/2. Language: Italian
Number of pages: 19 (pp. 111-129). File size: 85 KB
DOI: 10.3280/SP2015-002006


The essay examines how the most recent Research Quality Assessment (VQR 2004-2010) was carried out in Italy. It argues that the "absolute" separation of research products from the processes and structures that produce them prevents the assessment from serving as a reflexive tool for the scientific community and prevents the indicators from being useful to scientific activity. The essay draws on individual and collective reflections developed within the Italian Sociological Association and the other scientific associations of Area 14 of the National University Council. Building on these insights, the author highlights the contradictions and overlaps in the proposed evaluation criteria and in the way they were operationalized and combined into a single index. On this basis, proposals for improving the next research assessment are put forward; in particular, the author suggests using Cantril scales and indicators built on "circumstantial" evidence, which would assign greater responsibility to the evaluator, who must therefore document and properly explain the rationale for his or her judgments.

Keywords: Evaluation; Evaluation of University Research; Research Assessment; Peer Evaluation


  • Simona Colarusso, Annalisa Di Benedetto, Caratteristiche, tendenze e mutamenti della produzione scientifica sociologica nell'era della valutazione. Analisi esplorativa di un caso di studio, in RIV Rassegna Italiana di Valutazione 66/2017, pp. 120
    DOI: 10.3280/RIV2016-066008

Ivo Colozzi, Per un ri-orientamento del dibattito italiano sulla valutazione della qualità della produzione scientifica, in "SOCIOLOGIA E POLITICHE SOCIALI" 2/2015, pp. 111-129, DOI: 10.3280/SP2015-002006