Abstract
The paper uses the concept of the ‘parametric game’ to describe a strategy of academic work that has become widespread in Poland following the introduction of a particular reward policy. The ‘parametric game’ consists in doing research and publishing its results in such a way as to fulfil the requirements of a given research evaluation system. Broadly speaking, researchers can adopt two main strategies in this game. The first is ‘Impactitis,’ in which only publications in journals with a high Impact Factor are acknowledged by a given scholarly community. The other strategy, whose definition and understanding are put forward in the present paper, is ‘running for points’ (the original Polish term being ‘punktoza’). Under this strategy, the most ‘profitable’ choice is to publish several articles in journals without any Impact Factor rather than a single paper in a top-tier journal. Drawing on the Polish system, I present the main mechanisms that produce the running-for-points strategy and several mechanisms that can reduce the negative consequences of playing the parametric game in this way. The article concludes with a discussion of the Polish system, suggesting how it can be improved and what lessons can be learnt from it. The conclusions are relevant for both scholars and policy makers in other countries.