The New Research Assessment Reform in China and Its Implementation

Lin Zhang (School of Information Management, Wuhan University, CN; https://orcid.org/0000-0003-0526-9677) and Gunnar Sivertsen (Nordic Institute for Studies in Innovation, Research and Education, Oslo, NO; gunnar.sivertsen@nifu.no; https://orcid.org/0000-0003-1020-3189)

Scholarly Assessment Reports (ISSN 2689-5870), Levy Library Press, 2020, 2(1): 3. DOI: 10.29024/sar.15. Research article. Received 1 April 2020; accepted 23 April 2020; published 12 May 2020.

Copyright © 2020 The Author(s). This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International License (CC-BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. See http://creativecommons.org/licenses/by/4.0/.

A radical reform of research assessment was recently launched in China. It seeks to replace a focus on Web of Science-based indicators with a balanced combination of qualitative and quantitative research evaluation, and to strengthen the local relevance of research in China. It trusts the institutions to implement the policy within a few months, but it does not provide the national platforms needed for coordination, influence and collaboration on developing shared tools and information resources, or for agreeing on definitions, criteria and protocols for the new procedures. Based on international experiences, this article provides constructive ideas for the implementation of the new policy.

Policy highlights

In response to the three main messages of the new policy, we suggest these possible solutions for the implementation:

Farewell to “SCI worship”: With the move away from Web of Science as a standard, an integrated research information system and a national journal evaluation system is needed.

From metrics to peer review: The function and weight of peer-review evaluation needs to be differentiated between the levels of the research system: individuals, units, institutions, and national agencies.

New priority to local relevance: The optimal balance between globalization and local relevance must be allowed to differ by type and field of research.

Keywords: Research evaluation; Research policy; Peer review; Bibliometric indicators; Journal evaluation; Research information system; Research integrity; China
Introduction

The Ministry of Science and Technology and the Ministry of Education in China recently published two policy documents (Ministry of Education 2020; MOST 2020) that have aroused intense discussion among Chinese academics and gained worldwide interest as well (Editorial in Nature 2020; Mallapaty 2020). The new policy aims to restore “the scientific spirit, innovation quality, and service contribution” of research and to “promote the return of universities to their original academic aims” (MOST 2020). The point of departure is that, for the last decade, China’s research evaluation and funding policies have had a strong focus on quantitative indicators and publications in journals covered by the Web of Science. Indeed, it was partly this focus that helped China surpass the USA as the largest contributing nation to international scientific journals (Tollefson 2018). However, the government now wishes to balance further internationalization with domestic needs and its traditional quantitative evaluation methods with qualitative peer review. The main messages are:

Farewell to “SCI worship”. Indicators based on Web of Science will not be applied directly in evaluation and funding at any level. An alternative citation index with Chinese characteristics and international influence will be established.

From metrics to peer review. A new focus on novelty, scientific value, research integrity, innovation potential and societal outcomes will replace the “paper only” orientation in panel evaluations. Publications will be presented for review as a limited set of “representative work” with explicit relevance for the evaluation. The number of publications and journal impact factors will no longer count.

New priority to local relevance. Publications in high-quality Chinese journals will be encouraged, and the development of such journals will be supported.

The burning question in China now is how the new policy will be implemented and with what consequences. The universities are urged to implement the policy locally by the end of July at the latest. The government trusts them to find their own solutions in doing so, which is a good sign of respect for autonomy. However, the need for national coordination and services seems to be underestimated. We will relate some problems of implementation and their possible solutions to the three points above.

The background for the new policy

The two new policy documents did not come out of the blue. In recent years, there have been several calls to change research evaluation and funding protocols from a quantitative to a more qualitative focus in China. In 2016, President Xi Jinping called for reform towards a more comprehensive evaluation system for individual researchers (Xinhua News Agency 2016). Further, in 2018, a document issued by the joint force of three ministries and two national central institutions specifically proposed moving away from the “Four only” phenomenon of “only papers, only titles, only diplomas and only awards” (MOST 2018). Among these, the “only papers” focus has received the broadest attention, for good reasons.

The phenomenon has been called “SCI worship” in China for a long time (the original name of Web of Science was the Science Citation Index). Publications indexed in WoS and WoS-based indicators (e.g., the Journal Impact Factor and ESI highly cited papers) have become the core indicators for research evaluation, staff employment, career promotion, awards, university or disciplinary rankings, funding and resource allocation during the past years. Even individual cash incentives for WoS publications are widespread (Quan, Chen & Shu 2017).

The effects have been twofold. On the one hand, Chinese researchers have been encouraged to publish according to world standards and communicate more broadly and visibly with international communities. Chinese researchers have benefited from the advice gained from international peer-review processes and improved their research performance. On the other hand, the heavy reliance on “SCI papers” has been much debated as a form of goal displacement. Some individual researchers and institutions have pursued high numbers of publications while disregarding the quality and societal value of their research, even at the cost of research integrity, which has become a major concern for the Chinese government.

More than half of the world’s scientific articles are published in “international” journals owned by publishers in the Netherlands, the UK and the USA. China is now the largest contributing country to these journals, but only around 200 of the 11,000 journals indexed in the Web of Science are published in China. This imbalance has attracted attention in China. For one of the world’s leading countries in science and technology, publishing academic journals with an international reputation remains an unmet goal. A challenge for the growth of domestic journals in China has been to attract high-quality submissions, partly because publications in Chinese journals have not been valued much in research evaluations. “SCI worship” is thus perceived as a vicious circle that keeps down the standards and profile of domestic journals.

Despite the rapid development of open-access publishing all over the world, much of China’s scientific output is still locked behind paywalls. “NSFC funds about 70% of Chinese research articles published in international journals, but China has to buy them back with full and high prices”, said XiaoLin Zhang, chair of the Strategic Planning Committee of the National Science and Technology Library at the Ministry of Science and Technology in Beijing (Schiermeier 2018). Due to paywalls and language barriers, papers published by Chinese researchers in international journals are rarely read by general Chinese audiences. This is interpreted as a lack of benefit of research to the Chinese society and is part of the explanation for the increased focus on domestic publishing in the new evaluation and funding policy.

The new direction in China has similarities with initiatives in other parts of the world, such as the DORA declaration on research assessment (DORA 2013), the Leiden Manifesto for research metrics (Hicks et al. 2015), and the EU policy for Responsible Research and Innovation (European Commission 2020). None of these initiatives have resulted in easy solutions in any country. Instead, there are international collaboration projects among funders, institutions and countries for developing common standards and information sources for holistic evaluations that go beyond publication and citation metrics. The aim is to ensure fairness and predictability as researchers are mobile between collaboration networks, funders, institutions and countries.

We believe that the new policy in China will lay the ground for more mutual understanding, learning and collaboration between countries in the domain of research evaluation and funding. Other countries will not only recognize the positive scientific and societal values that the new Chinese policy is based on, but also some problems with implementation that we will discuss here in relation to the three main messages of the policy listed in our introduction.

Farewell to “SCI worship”

By moving away from Web of Science (or Scopus) as a standard for their national research evaluation and funding system, China is empowering its own academic communities, research institutions and funding organizations in defining the principles, criteria and protocols for evaluation. Essentially, the country is moving from a commercial product-based system to a self-determined and self-organized criteria-based system (Aksnes & Sivertsen 2019). To fulfil this move, an integrated research information system and a national journal evaluation system is still needed.

The new policy is aware of the need for a journal evaluation system to replace the use of journal impact factors and to cover journals beyond Web of Science. However, the ideas about how to implement such a system need to be further developed. The policy speaks of blacklisting journals with questionable purposes and of giving extra weight to “three types of high-quality papers, including those published in domestic scientific journals with international impact, internationally recognized top-level or important scientific journals, and papers reported at top academic conferences in China and abroad” (MOST 2020). The selections are supposed to be narrow and made locally by “the academic committee of the unit” without clearer criteria than in these sentences.

Other existing initiatives in the same direction will need to be coordinated. A narrow selection of journals has already been launched by the “Chinese Science and Technology Journal Excellence Action Plan”, released in 2019 with the aim of improving the quality of a selected number of scientific and technological journals (285 in total) and of speeding up the process of establishing these Chinese journals as world-class (CAST 2019). In addition, China has citation indexing services that cover selections of domestic scientific journals. These services even include the social sciences and humanities. The services are organized differently by different institutions or companies. They are ‘products’ with different criteria for the inclusion of journals, and the Chinese academic communities have no direct influence on these criteria. Now, the new policy wants to establish an alternative citation index with “Chinese characteristics and international influence” to replace the SCI, in which Chinese journals are underrepresented. How can all these initiatives be coordinated?

To set a new legitimate standard with transparent procedures, there is a need to create a comprehensive list of acknowledged journals that represents a continuum of all research fields and includes both domestic and international journals, while taking care of marginal fields and interdisciplinarity. The list must be dynamic to reflect a changing journal market, and it needs to be organized to represent a balanced influence of expert advice from all disciplines in China through inter-institutional representative bodies. The same organization of expert advice is needed if extra weight is to be given to specific selections of journals on the comprehensive list. Examples of such dynamic lists already exist in several non-English-speaking countries in Africa, Asia, Europe and Latin America, e.g. the Latindex and the Nordic list.
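As a purely illustrative sketch, the core of such a dynamic journal register could be as simple as the following data model. The field names, the two-level weighting and the example journals are our own assumptions, inspired by registers such as the Nordic list, and are not prescribed by the new policy.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class JournalEntry:
    issn: str                 # stable identifier (print or electronic ISSN)
    title: str
    disciplines: List[str]    # expert panels responsible for the entry
    domestic: bool            # published in China or abroad
    level: int = 1            # 1 = acknowledged, 2 = extra weight (narrow selection)
    active: bool = True       # set to False when delisted; entries are revised yearly

def extra_weight_selection(register: List[JournalEntry]) -> List[JournalEntry]:
    """Return the narrow selection of journals given extra weight by the panels."""
    return [j for j in register if j.active and j.level == 2]

# Invented example entries, one domestic and one international journal.
register = [
    JournalEntry("0000-0001", "Example Chinese Journal of Chemistry",
                 ["Chemistry"], domestic=True, level=2),
    JournalEntry("0000-0002", "Example International Journal of Chemistry",
                 ["Chemistry"], domestic=False),
]
print([j.title for j in extra_weight_selection(register)])
```

The point of keeping delisted journals as inactive entries, rather than deleting them, is that evaluations of earlier publications remain reproducible even as the register changes from year to year.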

The new policy demands a broader perspective in research assessment on novelty, scientific value, research integrity, innovation potential and societal outcomes. How will panels be informed about and be able to compare such achievements? There is a need for more comprehensive sources of information to supplement Web of Science. These sources could be integrated in a national Current Research Information System (CRIS). We will briefly describe the idea, based on Sivertsen (2019), and the potential and challenges for its implementation in China.

Current research information systems (CRIS) are databases or other information systems used within and among research organizations to store, manage, and exchange data for the documentation, communication, and administration of research activities. CRIS usually contain information about researchers and research groups, their projects, funding, outputs, and outcomes. In the most advanced versions, CRIS help produce integrated data for what used to be documents for separate purposes, such as individual applications for funding, institutional annual reports, project reports, CVs, publication lists, profiles of research groups, information for media and the general public, etc. Searchable bibliographic references may lead to full texts in local repositories.
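To make the idea concrete, the following minimal sketch shows how the core CRIS entities mentioned above could be linked through persistent identifiers. The entity and field names are illustrative assumptions, not a description of any existing CRIS product.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Person:
    person_id: str                 # persistent researcher ID (e.g., ORCID)
    name: str
    affiliation: str               # current, registry-based affiliation
    position: Optional[str] = None

@dataclass
class Project:
    project_id: str
    title: str
    funder: str
    participants: List[str]        # person_ids of the project members

@dataclass
class Output:
    output_id: str
    title: str
    output_type: str               # journal article, book, dataset, patent, ...
    authors: List[str]             # person_ids rather than name strings
    project_ids: List[str] = field(default_factory=list)
    repository_url: Optional[str] = None   # link to full text in a local repository
```

Because outputs point to person identifiers rather than to name strings, the same records can serve CVs, project reports and statistics without author-name disambiguation.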

With an integrated CRIS providing data that are structured and quality assured for statistical purposes, research performing and funding organizations may also use CRIS for monitoring and evaluating research activities and outputs, allocating funding, supporting decision making on their policies and strategies, tracking researchers’ careers, and describing their systemic role to policy-makers, stakeholders, and the public.

A CRIS has information about identifiable persons (not only authors). This would solve the author-name disambiguation problem in China. It has updated institutional affiliations, titles, and positions (not only published addresses), which may support and reflect mobility in a more efficient way. It can have more complete information than is available in funding acknowledgements in publications, such as CVs, projects, networks, and memberships. A CRIS may also cover publications more comprehensively than existing bibliographic data sources do. A Chinese CRIS could integrate the international and domestic indexing services mentioned above and go beyond them within a limit set by definitions and criteria (Aksnes & Sivertsen 2019).

CRIS are now widespread in the world, but most of them operate at the institutional level only and are closed systems. The existing commercial solutions, e.g. Converis from Clarivate and Pure from Elsevier, are designed for such local use only. It seems to be a challenge, mainly in the larger countries, to agree on an integrated and open national solution. However, a few countries, e.g. Brazil, Czech Republic, Finland, New Zealand, and Norway, have managed to integrate a CRIS nationally with the help of non-commercial solutions.

In China, most Chinese universities and other research organizations already have their own local systems. An example is Shanghai Tech University with one of the most advanced technical CRIS solutions worldwide, but only for local use (kms.shanghaitech.edu.cn). At the national level, the Ministry of Education (MoE), the Ministry of Science and Technology (MOST), and the National Natural Science Foundation of China (NSFC) have built databases for research and education information (with different focuses, strengths and coverages), but they are mainly used as internal sources.

An integrated research information system in China would relieve individual researchers and institutions from providing all the information themselves every time they are evaluated. The national research information system should comprise both international and domestic scientific publications, and other types of research outputs and information such as books for teaching and general audiences, inventions, education, government advice and interaction with culture, society and industry.

From metrics to peer review

The new policy targets evaluation and funding at all levels in the Chinese research and innovation system. It seems to us that the ideas in the new policy for evaluation at more aggregated levels (institutions, thematic programs, research sectors) are modeled on methods that are relevant for individual-level research assessment. The background might be that, up to now, a predictable “evaluation currency” of indicators based on Web of Science has been used at all levels in the Chinese research system. With the present move from metrics to peer review, a multi-level application model for roles and procedures in research evaluation is needed (Moed 2020). According to this model, the institutions themselves combine indicators and expert knowledge in their internal assessment and funding processes to evaluate individuals and groups. Externally, national meta-institutional agencies use aggregate indicators at the level of institutions and thereby assess the evaluation and funding processes inside institutions. Following this model, it would be inappropriate, and in conflict with institutional autonomy, for a meta-institutional agency to be directly concerned with the assessment of individuals or groups at the institutional level.
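The division of labour in such a multi-level model can be summarized schematically as below. The labels and assignments are our own illustration of the idea, not taken verbatim from the policy documents or from Moed (2020).

```python
# Schematic summary of a multi-level application model; purely illustrative.
EVALUATION_MODEL = {
    "individuals and groups": {
        "evaluator": "the institution (internal committees with external peers)",
        "evidence": ["representative publications, read by experts",
                     "CV and CRIS record", "contributions to institutional aims"],
        "metrics": "supporting information only",
    },
    "institutions": {
        "evaluator": "national meta-institutional agencies",
        "evidence": ["how internal evaluation and funding procedures are organized"],
        "metrics": "aggregate indicators, not limited to the Web of Science",
    },
}

for level, spec in EVALUATION_MODEL.items():
    print(f"{level}: evaluated by {spec['evaluator']}")
```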

As an example, the new policy requires a small maximum number of representative publications to be read and evaluated at all levels. We are not sure that individual publications need to be read in evaluations at all levels and for all purposes in the world’s largest research system. In Europe, there are countries with national evaluation systems where scientific publications are read and evaluated for the third or fourth time by expert committees (before publication, before recruitment or promotion, before external funding, and then again in the national evaluation), even though it has been shown that metrics would give approximately the same evaluation outcome at the aggregated level (Harzing 2019; Traag & Waltman 2019).

We think the new policy needs to differentiate more clearly between appropriate methods at different levels of aggregation. Depending on the purpose, metrics can be quite useful at aggregate levels (Sivertsen 2017), and the data and indicators do not need to be limited to WoS. Alternative or supplementary data sources already exist that have a more comprehensive coverage of all fields and of both domestic and international publishing (Engels & Guns 2018; Pölönen 2018; Sivertsen 2018a), and indicators exist that can provide a more balanced representation of the publishing patterns in all fields (Sivertsen, Rousseau & Zhang 2019). Such indicators are usually only applicable at aggregate levels (Aagaard 2015).
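As a toy example of an aggregate-level indicator that is not limited to the Web of Science, the following sketch computes the fractionalised share of domestic versus international publishing per field for one institution. Plain fractional counting is used here for simplicity; the cited literature (e.g. Sivertsen, Rousseau & Zhang 2019) proposes more refined counting methods, and the data are invented.

```python
from collections import defaultdict

# (field, published_in_domestic_journal, institution_authors, total_authors)
publications = [
    ("Law",          True,  2,  2),   # domestic law journal, both authors ours
    ("Law",          True,  1,  3),
    ("Law",          False, 1,  2),
    ("Astrophysics", False, 4, 40),   # large international collaboration
    ("Astrophysics", False, 2, 25),
    ("Astrophysics", True,  1, 30),
]

credit = defaultdict(lambda: {"domestic": 0.0, "international": 0.0})
for fld, domestic, inst_authors, all_authors in publications:
    share = inst_authors / all_authors          # fractional credit for the institution
    key = "domestic" if domestic else "international"
    credit[fld][key] += share

for fld, c in credit.items():
    total = c["domestic"] + c["international"]
    print(f"{fld}: {c['domestic'] / total:.0%} of fractionalised output in domestic journals")
```

Even this toy example shows why such indicators belong at aggregate levels: the per-field profiles say something about an institution's publishing pattern, not about the quality of any individual paper or researcher.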

Different criteria and procedures will also need to be applied according to the type and field of research and the purpose of the evaluation, e.g., recruitment or promotion versus external project funding versus program evaluation. Evaluation protocols, as well as rules for appointing expert peers, will need to be drawn up for these different contexts.

The policy seems to imply that the universities themselves should be responsible for developing evaluation protocols that specify criteria and procedures. We support this trust in institutional autonomy but still think inter-institutional disciplinary collaboration will be important in developing the protocols and putting them into practice. External peer review may be needed. In small countries, ‘external’ usually means inviting experts from other countries. In larger countries, e.g. in the Research Excellence Framework of the United Kingdom, the invited experts must at least come from different institutions. In our view, a combination is preferable. National experts are needed to provide insight into local research conditions and specializations, while experts from abroad are needed for external perspectives and independent views. A completely internal institutional peer-review system may suffer from cronyism and bias due to conflicts of interest. This may be a particular concern among young researchers in China, who are now accustomed to the verdicts of international peers when they publish.

Furthermore, moving towards peer review allows for formative evaluations. A formative evaluation learns from the past (strengths and weaknesses), looks forward (opportunities, threats) and serves strategic development. A summative evaluation looks at past performance, checks whether goals or expectations have been reached, and serves accountancy, decisions and/or resource allocation (Scriven 1967; Lepori & Reale 2012; Sivertsen 2017). Summative evaluation has dominated in China. The idea of formative evaluation is not present in the new policy. National coordination could allow for organizational learning at the aggregate level of institutions and for inter-institutional disciplinary collaboration. Institutions may have different strengths and can improve from collaboration even more than from competition.

In addition, some of the criteria that the new policy wants to stress instead of ‘only papers’ are difficult to record and measure, e.g., innovation and collaboration with industry, improving health outcomes, professional education and other forms of societal impact. Depending on the profile and purpose of a university, such potential outcomes of research are often explicitly stated in the aims of the institution. A formative evaluation could be focused on how the institution organizes and performs according to such aims (Sivertsen & Meijer 2019). Moreover, an evaluation of what every researcher does to reach the same aims might not be needed. Individual researchers might have different roles, talents and opportunities to accomplish important societal aims of research. An organizational level evaluation would allow for such differences and still give advice for improvement at the university level.

Local relevance

For some years now, there has been a concern that research funded and performed in China and expected to be useful for Chinese society is published in English in very distant journals. Recently, there was a debate about the fact that one of the first scientific articles carrying an early warning of the Coronavirus was published by Chinese scientists in a Western international journal before the general public in China was informed about the epidemic (CDC 2020). The debate turned into a more general discussion of whether new scientific results from China should first be published in international or in domestic journals. This was one of the main controversies in Chinese social media in the early stage of the Coronavirus outbreak. Those against international publishing argued that it would delay the immediate use of the new knowledge needed to control the epidemic in China (Zhang et al. 2020). This reaction in the general public is understandable, but experts in the field will know that international and local publishing cannot replace each other. Both are needed, and it is a question of balance. Time has also shown that global exchange of information and advice is crucial to stop the Coronavirus epidemic itself as it reaches other countries and continents.

The new policy encourages researchers to publish more in domestic journals. It even specifies that “in principle, when researchers provide representative publication lists, papers from domestic journals should account for at least one third of all the publications”. In our view, this criterion might work as a general policy aim but needs to be applied with differentiation according to field and type of research and the purpose of communication. As examples, most studies in law will be published in national law journals in a country’s native language because the research is most often about national law and aims to serve the country’s legal system. On the other hand, most studies in astrophysics are published in English in all countries to ensure global research communication on topics that all countries share. But these publications in English are not preventing astrophysicists all over the world from communicating their knowledge to a general public in their native language. Publications in English are not in themselves making Chinese research less useful to the Chinese society.
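If the one-third guideline were to be applied with such differentiation, a hypothetical implementation could look like the following sketch, where the field-specific thresholds are invented for illustration and would in practice have to be set by disciplinary bodies.

```python
# Hedged sketch of a field-differentiated check of the "at least one third
# domestic" guideline for representative publication lists. Thresholds are
# invented for illustration only.
DEFAULT_THRESHOLD = 1 / 3
FIELD_THRESHOLDS = {            # hypothetical values, to be set by disciplinary bodies
    "Law": 0.6,                 # mostly national-facing research
    "Astrophysics": 0.1,        # mostly global research communication
}

def domestic_share_ok(field: str, n_domestic: int, n_total: int) -> bool:
    """Check a representative list against a field-differentiated threshold."""
    threshold = FIELD_THRESHOLDS.get(field, DEFAULT_THRESHOLD)
    return n_total > 0 and n_domestic / n_total >= threshold

print(domestic_share_ok("Astrophysics", n_domestic=1, n_total=5))  # True (0.2 >= 0.1)
print(domestic_share_ok("Law", n_domestic=2, n_total=5))           # False (0.4 < 0.6)
```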

There is also a need to specify what ‘domestic journals’ means in China. There are numerous professional or general journals where researchers publish in Chinese for broader audiences. In addition, all disciplines have several scientific journals where researchers publish in Chinese for domestic research communication. This is a strong tradition in China, with a clear hierarchy of prestige among journals in each field. While more journals are needed to compete for good manuscripts at the top, the less important journals are abundant and need to improve the general quality of their procedures and contents. Finally, an increasing number of scientific journals are edited at, and published by, Chinese research organizations to serve international research communication among authors from all countries. Some of them operate with so-called ‘diamond’ Open Access, i.e. without article processing charges, and might have international success for this reason. This third group seems to be a meaningful target as the new policy speaks of “three types of high-quality papers, including those published in domestic scientific journals with international impact” (MOST 2020).

The balance between globalization and local relevance needs to be empirical and dynamic, that is, reflecting a statistically informed policy for reasonable change (Sivertsen 2018b). Although China is now the largest contributing country to publications indexed in Scopus (Tollefson 2018), most of its scientific publications are still published in Chinese in China, with variations among fields. Some fields are internationally visible and impactful, others are not. The annual volume of domestic articles indexed by the Chinese Social Sciences Citation Index (CSSCI) is still around ten times higher than the annual volume of articles from China indexed by the Social Science Citation Index for the Web of Science (Zhang et al. 2020). Young researchers are generally more active in publishing internationally than older researchers. Hence, the new policy resonates differently in the academic community. Some researchers are happy to leave behind the policy of globalization. Others are concerned that support for collaborating and publishing abroad will be taken away from them. China needs to find a differentiated and dynamic balance between local relevance and globalization of research.

Conclusion

To conclude, the new evaluation and funding policy in China makes an important effort to replace a focus on WoS-based indicators with a balanced combination of qualitative and quantitative research evaluation, and to strengthen the local relevance of research in China. It trusts the institutions to implement the policy quickly but does not provide the national platforms needed for coordination, influence and collaboration on developing shared information resources and tools, and for agreeing on definitions, criteria and protocols. In response to the three main messages of the new policy, this is our summary of possible solutions to the implementation problems:

With the move away from Web of Science as a standard, a national research information system and a journal evaluation system will be needed. The journal evaluation system may support the legitimacy and transparency of the general evaluation and funding system and help to promote an empirical and dynamic balance between globalization and local relevance. It will also encourage domestic Chinese journals to perform better. A national research information system would provide research assessment with relevant information according to its new broader scope, not only a more comprehensive coverage of publications, but also information about innovation and collaboration with industry, improving health outcomes, professional education and other forms of societal impact.

The function and weight of peer-review evaluation needs to be differentiated between different levels in the research system. At aggregate levels, responsible use of metrics can be relevant. The metrics need not be based on the Web of Science. Alternative indicators already exist that have a more balanced representation of all fields, and of both domestic and international publishing. Instead of reading papers, expert advice can be used in formative organizational-level evaluations focusing on what is done to support individual researchers in achieving the aims of the organization.

The balance between globalization and local relevance needs to be empirical and dynamic, that is, reflecting a statistically informed policy for reasonable change, considering that the optimal balance will differ by type and field of research. Researchers should be confident in finding a relevant publishing practice in which balanced international and domestic publishing contribute together to research quality and local relevance.

Researchers and their organizations should not only be subjected to research evaluation. They should also be involved in deciding the criteria and designing the evaluation protocols. Collaboration between research performing and research funding organizations, and with representative disciplinary bodies at the national level, is needed to develop shared criteria and protocols for use in different contexts at the local level.

With these measures to implement the new policy, we believe that researchers in all fields and their different organizations will more easily identify with the new criteria and procedures for research evaluation and funding. This might take away some of the burden of the current evaluation system. The burden can also be reduced by general changes in governance. The performance-based portion of funding allocation can be reduced as a move from continuous control to trust-based governance of projects and organizations. By trusting researchers and their organizations to spend their resources efficiently and responsibly, more funding can be spent with long-term predictability for projects and organizations. This could also contribute to the main aim of the new policy, which is to re-establish the original scientific and societal values that should guide research in China.

Competing Interests

The authors have no competing interests to declare.

References

Aagaard, K. (2015). How incentives trickle down: Local use of a national bibliometric indicator system. Science and Public Policy, 42(5), 725-737. DOI: 10.1093/scipol/scu087

Aksnes, D. W., & Sivertsen, G. (2019). A criteria-based assessment of the coverage of Scopus and Web of Science. Journal of Data and Information Science, 4(1), 1-21. DOI: 10.2478/jdis-2019-0001

CAST. (2019). [in Chinese] “Notice of the journal lists according to Chinese science and technology journal excellence action plan”. Retrieved from http://www.cast.org.cn/art/2019/11/25/art_458_105664.html

CDC. (2020). [in Chinese] “Response from Chinese Center for Disease Control and Prevention for a NEJM publication”. Retrieved from http://tech.gmw.cn/2020-01/31/content_33513568.htm

DORA. (2013). Retrieved from https://sfdora.org/

Editorial in Nature. (2020). China’s research-evaluation revamp should not mean fewer international collaborations. Nature, 579, 8. DOI: 10.1038/d41586-020-00625-0

Engels, T. C. E., & Guns, R. (2018). The Flemish performance-based research funding system: A unique variant of the Norwegian model. Journal of Data and Information Science, 3(4), 45-60. DOI: 10.2478/jdis-2018-0020

European Commission. (2020). Responsible research & innovation in Horizon 2020. Retrieved from https://ec.europa.eu/programmes/horizon2020/en/h2020-section/responsible-research-innovation

Harzing, A. W. (2019). Running the REF on a rainy Sunday afternoon: Do metrics match peer review? Retrieved from https://harzing.com/publications/white-papers/running-the-ref-on-a-rainy-sunday-afternoon-do-metrics-match-peer-review

Hicks, D., Wouters, P., Waltman, L., de Rijcke, S., & Rafols, I. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature, 520, 429-431. DOI: 10.1038/520429a

Lepori, B., & Reale, E. (2012). S&T indicators as a tool for formative evaluation of research programs. Evaluation, 18(4), 451-465. DOI: 10.1177/1356389012460961

Mallapaty, S. (2020). China bans cash rewards for publishing papers. Nature, 579, 18. DOI: 10.1038/d41586-020-00574-8

Ministry of Education, PRC. (2020). [in Chinese] “Some opinions on standardizing the use of related indicators of SCI papers in universities and establishing a correct evaluation orientation”. Retrieved from http://www.moe.gov.cn/srcsite/A16/moe_784/202002/t20200223_423334.html

Moed, H. F. (2020). Appropriate use of metrics in research assessment of autonomous academic institutions. Scholarly Assessment Reports, 2(1), 1. DOI: 10.29024/sar.8

MOST. (2018). [in Chinese] “Implementation of the special action of clearing up ‘only paper, only title, only diploma, and only award’”. Retrieved from http://www.most.gov.cn/tztg/201810/t20181023_142389.htm

MOST. (2020). [in Chinese] “Some suggestions to eliminate the bad orientation of ‘paper-only’ in scientific and technological evaluation (Trial)”. Retrieved from http://www.most.gov.cn/mostinfo/xinxifenlei/fgzc/gfxwj/gfxwj2020/202002/t20200223_151781.htm

Pölönen, J. (2018). Applications of, and experiences with, the Norwegian model in Finland. Journal of Data and Information Science, 3(4), 31-44. DOI: 10.2478/jdis-2018-0019

Quan, W., Chen, B. K., & Shu, F. (2017). Publish or impoverish: An investigation of the monetary reward system of science in China (1999-2016). Aslib Journal of Information Management, 69(5), 486-502. DOI: 10.1108/AJIM-01-2017-0014

Schiermeier, Q. (2018). China backs bold plan to tear down journal paywalls. Nature, 564, 171-172. DOI: 10.1038/d41586-018-07659-5

Scriven, M. (1967). The methodology of evaluation. In R. W. Tyler, R. M. Gagné & M. Scriven (Eds.), Perspectives of curriculum evaluation (Vol. 1, pp. 39-83). Chicago, IL: Rand McNally.

Sivertsen, G. (2017). Unique, but still best practice? The Research Excellence Framework (REF) from an international perspective. Palgrave Communications, 3, 17078. DOI: 10.1057/palcomms.2017.78

Sivertsen, G. (2018a). The Norwegian model in Norway. Journal of Data and Information Science, 3(4), 3-19. DOI: 10.2478/jdis-2018-0017

Sivertsen, G. (2018b). Balanced multilingualism in science. BiD: textos universitaris de biblioteconomia i documentació, 40. DOI: 10.1344/BiD2018.40.25

Sivertsen, G. (2019). Developing Current Research Information Systems (CRIS) as data sources for studies of research. In W. Glänzel, H. F. Moed, U. Schmoch & M. Thelwall (Eds.), Springer Handbook of Science and Technology Indicators (pp. 667-683). Cham: Springer. DOI: 10.1007/978-3-030-02511-3_25

Sivertsen, G., & Meijer, I. (2019). Normal versus extraordinary societal impact: How to understand, evaluate, and improve research activities in their relations to society? Research Evaluation, 29(1), 66-70. DOI: 10.1093/reseval/rvz032

Sivertsen, G., Rousseau, R., & Zhang, L. (2019). Measuring scientific production with modified fractional counting. Journal of Informetrics, 13(2), 679-694. DOI: 10.1016/j.joi.2019.03.010

Tollefson, J. (2018). China declared world’s largest producer of scientific articles. Nature, 533, 390. DOI: 10.1038/d41586-018-00927-4

Traag, V. A., & Waltman, L. (2019). Systematic analysis of agreement between metrics and peer review in the UK REF. Palgrave Communications, 5, 29. DOI: 10.1057/s41599-019-0233-x

Xinhua News Agency. (2016). [in Chinese] “Central Leading Group for Comprehensive Deepening Reform (Twenty-ninth meeting)”. Retrieved from http://www.gov.cn/xinwen/2016-11/01/content_5127202.htm

Zhang, L., Shang, Y. Y., Huang, Y., & Sivertsen, G. (2020). Toward internationalization: A bibliometric analysis of the social sciences in Mainland China from 1979 to 2018. Quantitative Science Studies, submitted for peer review.

Zhang, L., Zhao, W. J., Sun, B. B., Huang, Y., & Glänzel, W. (2020). How scientific research reacts to international public health emergencies: A global analysis of response patterns. Scientometrics. DOI: 10.1007/s11192-020-03486-6