Characteristics, trends and changes of the sociological scientific production in the evaluation era. Exploratory analysis of a case study

Journal: RIV Rassegna Italiana di Valutazione
Authors: Simona Colarusso, Annalisa Di Benedetto
Publishing year: 2017; Issue: 2016/66; Language: Italian
Pages: 19 (pp. 120-138); File size: 478 KB
DOI: 10.3280/RIV2016-066008

The paper analyzes, from a diachronic perspective, the sociological scientific production of Sapienza University. Publication types, publication languages, and co-authorship are examined from 2004 to 2014, on the assumption that the implementation of research evaluation policies ("VTR", "VQR") and regulatory changes concerning academic careers (the comparative evaluations known as "ASN") affected the characteristics of scientific production. The exploratory analysis presented here, with all the limitations of a case study, draws on data extracted from the University's scientific production repository and shows different trends of change, over the decade 2004-2014, in the characteristics of the contributions and in publication practices. The results provide an interesting overview of sociological scientific production and, in relation to the implementation of evaluation policies, evidence on the effects that evaluation processes are producing on scientific communication.

Keywords: Scientific Production; Co-Authorship; Research Evaluation; Diachronic Analysis; Evaluation Impact.


Simona Colarusso, Annalisa Di Benedetto, Caratteristiche, tendenze e mutamenti della produzione scientifica sociologica nell’era della valutazione. Analisi esplorativa di un caso di studio in "RIV Rassegna Italiana di Valutazione" 66/2016, pp 120-138, DOI: 10.3280/RIV2016-066008