Methodology for Evaluating Research Organisations and Research, Development and Innovation Purpose-tied Aid Programmes

Approved under Czech Government Resolution No. 107 of 8 February 2017.

RESEARCH, DEVELOPMENT AND INNOVATION COUNCIL DEPARTMENT
Issued by: © Office of the Government of the Czech Republic, 2018 [Reprint, 1st ed.]
Nábřeží Edvarda Beneše 4, 118 01 Prague 1
ISBN 978-80-7440-214-2 (online: PDF)
ISBN 978-80-7440-206-7 (paperback)

TABLE OF CONTENTS
SUMMARY
Introduction
Main Goals, Purpose and Background of Evaluation
1. Basic Evaluation Modules
1.1 MODULE 1 - Quality of Selected Results
1.2 MODULE 2 - Research Performance
1.3 MODULE 3 - Social Relevance
1.4 MODULE 4 - Viability
1.5 MODULE 5 - Strategy and Policies
1.6 Definition of Results
2. National Level of Evaluation
2.1 Evaluation of Situation in R&D&I
2.2 Annual Evaluation of Results
2.3 Expert Oversight of Evaluation
2.4 Providing Conditions for Evaluation
2.5 Technical Supplies, IT Support
2.6 Evaluation Tools and Expert Panels
2.7 Evaluation Procedure and its Formal Aspects
2.8 Report for Research Organisation and Provider
2.9 Bias
3. Implementation Period
3.1 Principles of Funding in Implementation Period
3.2 Base Fixation
3.3 Distribution of Aid Increase by Evaluation
4. Discussing Evaluation Results with Providers
4.1 Discussing Full Evaluation with Providers
4.2 Annual Discussion over National Evaluation with Provider
4.3 Annual Discussion of Evaluation with Provider in Implementation Phase
4.4 Funding
5. Evaluation in the Universities Segment
5.1 Evaluation Procedure
5.2 Assigning Qualitative Grades
5.3 Discussing Evaluation Results
6. Evaluation in Segment of Governmental Departments
6.1 Purpose of Evaluation
6.2 Provider Five-year Evaluation Cycle and Relation to Annual Evaluation of RO Results
6.3 Evaluation Underlying Documents
7. Evaluation of CAS
7.1 Current Situation
7.2 Objectives, Principles and Content of Evaluation
7.3 Future Development
8. Evaluation of Purpose-tied Aid Programmes
Appendix 1: Methodology of Evaluating Research Organisations in Governmental Departments
Introduction
1. Evaluation in Segment of Governmental Departments - General Considerations
1.1 Purpose of Evaluation
1.2 Evaluation Principles
2. Evaluation Milestones
2.1 RO Evaluation by LCDRO Conducted by Provider
2.2 Procedure in RO Evaluation by LCDRO Conducted by Provider
3. Input Evaluation for 2018-2022 Conducted in 2017
3.1 Request for RO to Submit Documents with Defined Particulars and Maximum Amount of Aid
3.2 Submitting Documents with All Data Required
3.3 Provider Evaluating Completeness of the Subsidy Application and Data (Evaluation Phase 1)
3.4 Provider Evaluating the Meeting of the Criteria for the Granting of Subsidy (Evaluation Phase 2)
3.5 Provider Evaluating LCDRO by Peer Review Evaluation through Expert Advisory Body (Evaluation Phase 3)
3.6 Issue and Publish Decision to Grant LCDRO Institutional Aid
4. Continuous Evaluation for 2018-2021 Conducted in 2019-2022
4.1 Progress Report
4.2 Changes during the Year
5. Final Evaluation for 2018-2022 Conducted in 2023
5.1 Final Report
5.2 Final Evaluation
Appendix 2: Conflict of Interests as Treated by CAS and IPn Methodology
1. Conflict of Interests as Treated in CAS Evaluation
2. Conflict of Interests as Treated in IPn Methodology
Appendix 3: Institutional LCDRO Expenditures in 2017-2019

SUMMARY

The purpose of evaluating the research, development and innovations ("R&D&I") system through the Methodology for Evaluating Research Organisations and Research, Development and Innovations Purpose-tied Support Programmes ("M17+") is to:
• Collect information for quality management of R&D&I at all levels (the formative aspects);
• Enhance the efficiency of spending public funds (the comprehensive aspects);
• Support the quality and international competitiveness of Czech R&D&I;
• Distribute and add to the accountability of the stakeholders in the R&D&I system;
• Get information for granting subsidies for the long-term conceptual development of a research organisation ("LCDRO").

Taking account of the different missions of research organisations ("RO") in the system of research, the evaluation scheme evaluates the outputs, impacts and overall development outlook of a RO; takes account of departmental specifics; uses informed and independent peer views in the evaluation process; evaluates ROs in both the national and international contexts; and provides information for allocating those public funds reserved for the institutional development of ROs.

The evaluation scheme is based on the experience from the last evaluation of the research institutes of the Czech Academy of Sciences ("CAS"); the evaluation of departmental ROs; the R&D&I evaluation according to the Methodologies for Evaluating Results of Research Organisations and Results of Completed Programmes for 2013-2016 (the "2013-2016 Methodology"); and the project Effective System of Evaluating and Funding Research, Development and Innovations (the "IPn Methodology"); and is in accordance with proven international practices. The evaluation scheme is in accordance with the Czech Republic's National Research, Development and Innovations Policy for 2016-2020 (the "R&D&I NP") and the long-term RO evaluation guidelines approved by the Research, Development and Innovations Council (the "R&D&I Council").

This document distinguishes two phases of evaluation: (a) the implementation phase, which is to be conducted in 2017-2019; and (b) regular comprehensive evaluation, which is to be started by 2020. In the implementation phase, the M17+ principles will be applied progressively, in a simplified manner, and with limited consequences for the funding of each RO.

The fundamental M17+ principles may be summed up as follows:

Three different evaluation levels. Each management level in the system of R&D&I requires information with varying degrees of detail. M17+ distinguishes three levels of management and evaluation: (a) evaluation for the management and funding of the complete R&D&I system - the central level - the R&D&I Council / the Section of Research, Development and Innovations, the Office of the Government of the Czech Republic (the "RDI Section"); (b) evaluation at the provider level; and (c) evaluation for the management of a RO. M17+ particularly addresses the national level and defines the methods of collaboration with the provider level.

ROs classified into three segments.
As the positions and missions of ROs in the system of R&D&I are different, ROs are classified into three segments for evaluation purposes: (a) universities; (b) institutes of CAS; and (c) departmental ROs.

Common framework for evaluating the quality of ROs. M17+ introduces a quality evaluation system comprised of five basic modules common to all ROs: M1 - Quality of selected results; M2 - Research performance; M3 - Social relevance; M4 - Viability; and M5 - Strategy and policies. The relative importance of the modules will differ according to the position and mission of a RO in the system of R&D&I. The modules are the evaluation framework, which can be adjusted by providers and adapted to a RO's position in the system of R&D&I.

Periodicity of evaluation. In the implementation phase, the annual national evaluation will cover particularly the tools of the M1 and M2 modules (bibliometric analysis, or remote reviews in those disciplines in which bibliometrics do not provide relevant evaluation data). Full evaluation using all five modules will be in place by 2020. The aim for the period after 2020 is to conduct a full evaluation process every five years.

[Figure: Progress of Results Evaluation by Segment. Key: H13-16 = Methodology for evaluating the results of research organisations and the results of completed programmes valid for 2013-2016; M17+ = Methodology for evaluating research organisations and research, development and innovations purpose-tied support programmes; fixed base 100% | increase (illustrative) - for exact amounts please refer to Table 2.]

Three basic tools of evaluation. Bibliometric analysis, assessment through remote reviews and assessment by a panel of experts will be applied in each module to evaluate ROs. Onsite visits by panels of experts will be added after the implementation phase is completed.

Inclusion of ROs in the evaluation system. Only those ROs which are entered in the Register of Public Research Institutes, maintained by the Ministry of Education, Youth and Sports (the "MEYS"), can be included in the system of evaluation. The results evaluated are those of ROs that are entered in the Results Information Register (the "RIR").

Expert panels. Six expert panels will be set up for expert RO evaluation by the OECD1 fields of research and development (Frascati Manual): Natural Sciences; Engineering and Technology; Medical and Health Sciences; Agricultural and Veterinary Sciences; Social Sciences; and Humanities and the Arts. Experts in applied and industrial research and experts from practice will also sit on these panels. Where appropriate and expedient, expert panels will be made up particularly of independent foreign experts. Panels will use remote reviews to evaluate selected results, usually with the involvement of foreign evaluators where appropriate and expedient. Expert panel evaluation will result in a proposal to include each result under one of the five qualitative ratings, the reasoning for the given rating, and a summary report for the given panel.

1 OECD Fields of Research and Development (FRASCATI Manual 2015), corroboratively WoS Categories, or Fields and Subfields. See: Základní principy Hodnocení výzkumné a odborné činnosti pracovišť AV ČR za léta 2010-2014. [Basic Principles of Evaluation at the Czech Academy of Sciences in 2010-2014] Praha: Akademie věd ČR, 2015.

Scaling of ROs.
Complete evaluation in all modules in a five-year cycle will result in placing the RO on a four-degree scale. This will be done after discussions between the provider2, the RDI Council/RDI Section and panel (deputy) heads; additional experts may be invited. Evaluation results are subject to approval by the RDI Council. The RDI Council prepares a report based on the results of the comprehensive evaluation of a RO, and this report will be discussed with the RO before publication. The scaling of RO results will be indicative in the implementation period. The first scaling valid for the long term will be conducted in 2019.

2 If the provider is different from the promoter, representatives of the promoter are also invited.

Implementation period. Transition to the new methodology will be gradual, and the implementation period covers the years 2017-2019. The key principle of the evaluation system in the implementation period is to keep the evaluation load as low as possible while still having a legitimate and justifiable process, and to prepare and implement new tools for robust, internationally comparable evaluation of national R&D&I. This period is also necessary in order to create conditions for organising the complete evaluation on the part of both the state administration and the ROs subject to evaluation. A combination of the tools in preparation for the M1 and M2 modules and results evaluation by national experts will be the basis for national evaluation in 2017 and 2018. Foreign evaluators will be involved through remote reviews in defined cases as described above. The results obtained in the past year, i.e. 2016 or 2017, will always be subject to evaluation. The year 2019 will be the first year for ROs to be evaluated using the full M1 module and the bibliometric analysis under M2, and the evaluation panels will be international. The results for 2014-2018 will be subject to evaluation using the evaluation results for 2017 and 2018, and scaling ratings will be attributed to ROs based on quality.

Transition to regular five-year evaluations. Providers and the individual parts of the system differ from each other in their preparedness for full evaluation. This fact is reflected in the implementation of M17+, and is also emphasised in the relevant passages of the text.

Principles of funding LCDRO. The funds for the LCDRO will be split into two segments: (a) a stabilisation segment (the base); and (b) a motivation segment (the increment). In the implementation period, the base will account for 100% of the LCDRO distributed using the 2013-2016 Methodology. The motivation segment of funding, no less than the year-on-year increase in the LCDRO, will be distributed using the evaluation results. Evaluation will produce a distribution of research organisations into four groups: A, B, C, and D. Based on Government Resolution No. 477 of 30 May 2016, the LCDRO allocated at the provider level is expected, in the mid-term outlook, to increase by approximately 4.5%, 6% and 10% in 2017, 2018 and 2019, respectively. The relation between evaluation and the LCDRO is demonstrated in the following table.

INTRODUCTION

In its formative function, the evaluation of research organisations is a critical strategic tool necessary for the effective management of the R&D&I system at all levels.
The knowledge gained in evaluation is the basis for the strategic documents of the national science policy, proposals for research priorities and national programmes, proposals for reforms in the system of R&D&I, and, if need be, a reorganisation of R&D&I institutions. The importance of evaluation has been growing globally, as a result of growing accountability for spending public money expediently and economically and of growing social pressure concerning the social justifiability of research. The importance of RO evaluation has also been growing as a result of the limited nature of the funds available.

This regulation fulfils the duty of the RDI Council defined in Act No. 130/2002 Sb., on public funding of research, experimental development and innovations, and amendments to some related acts (the Research, Experimental Development and Innovation Aid Act), as amended (the "Act"). The RDI Council is to ensure that "the Methodology for evaluating results of research organisations and completed programmes be prepared and submitted to government" (section 35(2)(c) of the Act) and "the results of research organisations and completed programmes be evaluated using the Methodology for evaluating results of research organisations and completed programmes subject to approval by government" (section 35(2)(d) of the Act).

The joint methodology is in accordance with the duties and needs of providers, takes account of their current practices, and allows the central management level of the R&D&I system to collect relevant information for making qualified decisions. M17+ is a framework document and will be supplemented with a system of additional, hierarchically subordinated documents. M17+ is prepared in detail for the national level and the implementation period, leading to full evaluation in all segments of R&D&I (separately and by phases). Any partial procedures, rules, charters and rules of procedure not specified in this document will be supplied gradually during the implementation period and in collaboration with the relevant stakeholders. Putting these partial documents into practice is subject to approval by the RDI Council. For all of the segments, M17+ is the minimum common framework that must be observed in segment evaluation. Individual ROs or the ROs in a single government department must be evaluated using a procedure that corresponds to the institutions evaluated - their roles and missions in the system of R&D&I (the subsidiarity principle).

MAIN GOALS, PURPOSE AND BACKGROUND OF EVALUATION

The goals of evaluation are to: (a) collect information for quality management of the system of R&D&I at all levels (the formative aspects); (b) enhance the efficiency of spending public funds (the comprehensive aspects); and (c) facilitate better quality and international competitiveness of Czech R&D&I. Therefore, the primary purpose is to obtain data for making decisions about the granting of institutional aid to the LCDRO in accordance with valid legislation, to obtain information for managing R&D&I in the Czech Republic, and to obtain the information necessary for providers to meet their roles and for RO management to manage their ROs over the long term.

The proposed procedure corresponds to the basic strategic documents currently valid for R&D&I: the R&D&I NP; the National Reform Programme of the Czech Republic for 2016; the policy statement of the government; Government Resolutions No. 1066 and 1067 of 21 December 2015; and the National Priorities of Oriented Research, Experimental Development and Innovations, and is in accordance with the long-term research organisation evaluation principles adopted by the RDI Council at the Council's 263rd meeting in March 2011.

The target conditions as defined by Measure 10 of the R&D&I NP are as follows: "Introduce such a system of research organisation evaluation that encourages raising the standard of research: In connection with the outputs from the IPn Methodology and the experience from CAS research institutes, introduce such a system of research organisation evaluation that takes account of the differences between research organisations by their role and mission in the system of R&D&I and motivates research organisations to raise the standard of their research, get involved in international research, conduct research that can be utilised in applications, and develop collaboration with the application sphere. Consequently, evaluation will include criteria that take account of the various aspects of research, such as research environment, international and national collaboration, research excellence, research performance, social relevance of research and impacts of research. Evaluation (including the ties to the distribution of institutional aid by research performance) will also stimulate ROs to improve strategic organisational management, develop international collaboration and establish relations with the application sphere. Accountability: RDI Section and RDI Council; joint competence: MEYS; collaborating bodies: other administrative authorities in charge of research and development, within their competence."

Taking account of the different missions of the ROs in the system of research, the evaluation scheme evaluates the outputs, the impacts and the institutional and overall development outlook of ROs; takes account of departmental specifics; uses informed peer views in the evaluation process; and provides information for allocating those public funds reserved for the institutional development of ROs. Evaluation is based on the experience from the last evaluation of CAS research institutes, the evaluation of departmental ROs, the evaluation of R&D&I using the 2013-2016 Methodology, and the IPn Methodology project.

M17+ uses joint and unifying features and is divided into three tiers by the level of control (government, providers, and ROs). The different levels of management in the system of R&D&I imply different needs addressed by evaluation in respect of the focus, inputs, shape, and degree of detail of the required evaluation outputs. Therefore, the following system levels of evaluation are recognised:
• I. Evaluation for managing and funding the entire system: the central authority - RDI Council/RDI Section;
• II. Evaluation at the provider level;
• III. Evaluation for managing ROs.

[Figure 1: R&D&I Levels of Management - the Central Authority manages the R&D&I system and allocates funds for budgetary chapters; Providers manage R&D&I within their scope, provide for the needs of their department and finance research organisations; Research Organisations manage the research institution itself.]

The purpose of evaluation is differentiated by the level of accountability and management in the system of R&D&I.
The responsibilities and purview of the RDI Section and the RDI Council differ from those of individual providers; this follows from the difference between the role played by the promoter of the research organisations under its purview and the role played by providers (and providers of institutional aid in particular). The management of each RO is the third tier, utterly different in terms of the degree of detail of management information.
• The task of tier I (referred to as the "central authority" in this document) is to control and coordinate research, development and innovations at the central government level and to submit to the government proposals for state budget R&D&I allocations in individual budgetary chapters, rather than to evaluate and decide the funding of each RO, which is the sovereign competence (both power and accountability) of each provider.
• The task of tier II (referred to as "providers") is to ensure R&D&I management and funding within their competence, and to fund and manage ROs.
• The task of tier III (referred to as "research organisations") is to secure formative evaluation in the degree of detail necessary for the managerial level, and to ensure the evaluation obtained in the previous tiers can be used.
The roles of the stakeholders are complementary to each other and not interchangeable. Each level of management and funding of R&D&I requires a different degree of information detail and, to some extent, uses different sources of information, often even different information. The simple idea that mere evaluation of all (i.e. the sum of) results of ROs would give a complete picture of the entire R&D&I sphere in the Czech Republic and be sufficient for managing and funding R&D&I is simply not true. The weakness of the existing system is that it works only retrospectively and does not provide full strategic support to new and developing disciplines.

M17+ uses a common framework to evaluate ROs that facilitates the comparability of evaluation results across the system of R&D&I. RO evaluation uses several evaluation criteria that have a common base but also correspond to the practices applied in the individual groups of ROs. Evaluation is based on the assessment of the quality of selected results, the overall performance of ROs, the social relevance of the given research, the research environment in each RO, the ROs' strategy and policies, their development potential, and their position in the national and international research community. It is expressly pointed out that the content of these general terms and their significance in evaluation can differ for each group of ROs, because each RO has a different role in the system and evaluation must always be related to the concrete mission of an institution. Evaluation according to the criteria defined is conducted using modules. Competence for the implementation of modules results from Act No. 130/2002 Sb., on public funding of research, experimental development and innovations and amendments to some related acts (the Research, Experimental Development and Innovation Aid Act), as amended, and Act No. 218/2000 Sb., on budgetary rules and amendments to some related acts (the Budgetary Rules), as amended, and respects the purpose of evaluation.
For evaluation purposes, ROs are divided into three basic groups3 based on their position in the system of R&D&I and the purpose of their establishment: research institutes of CAS; universities; and departmental ROs.4 This division also takes account of their readiness and the readiness of providers to implement full M17+, and is reflected in the overall evaluation schedule, which is structured so as to ensure that full evaluation in each segment is technically feasible.

In accordance with section 5a(2)(b) of Act No. 130/2002 Sb., evaluation is a source of data for the RDI Council to prepare a draft state budget for R&D&I. Given the different focus and purpose of the ROs under the competence of each provider, the evaluation cannot be directly reflected in the draft state budget for research, development and innovations (the "R&D&I SB"). Evaluation provides a source of data for decisions concerning the effective granting of institutional aid to the LCDRO and will be a tool to encourage improvements in the operations of ROs. The system of RO evaluation will be constructed gradually over the following four years, with each year adding to the comprehensiveness of the system. Full-range evaluation is expected to be implemented by 2020, and then conducted every five years. In the implementation period, national evaluation will be conducted annually, in accordance with Act No. 130/2002 Coll. The key principle of the evaluation system in the implementation period is to keep the evaluation load as low as possible while still having a legitimate and justifiable process, and to prepare and implement new tools for robust, internationally comparable and informative evaluation of national R&D&I. For that reason this document describes the system of evaluation in the implementation period (particularly in 2017-2018) separately. The RDI Council/RDI Section will ensure evaluation at the national level of R&D&I system management. At the provider level, evaluation will be conducted in collaboration between the provider/promoter and the RDI Council/RDI Section.

3 A RO's group classification is determined by the provider of institutional aid to the LCDRO.
4 Universities provide the activities listed in section 7(a) and (b) of Act No. 111/1998 Sb., on universities and amendments to other acts, as amended. The evaluation of university research should reflect these activities, such as the participation of students, PhD students, and postdoctoral researchers in the research conducted. CAS and its research institutes are founded primarily for conducting "scientific research" (section 13(a) and (b) of Act No. 283/1992 Sb., on the Czech Academy of Sciences, as amended). Departmental ROs and private research institutions primarily provide applied research and development in the various governmental departments and, as may be the case, are the research and knowledge base for their given departments.

1. BASIC EVALUATION MODULES

Evaluation will be conducted in five basic modules, which together will ensure the implementation of the strategic goals of the evaluation and funding system. The main modules are: Quality of selected results; Research performance; Social relevance; Viability; and Strategy and policies. These evaluation modules are relevant for all types of RO, irrespective of their discipline or type of research. However, the importance and extent of the modules will differ according to the position and mission of a RO in the national system of R&D&I.
The modules are the underlying structure of evaluation, which, at the provider level, may be supplemented with additional indicators that assess the specific features of different types of RO in more detail. Evaluation in each of the five modules will use the following basic tools to a differing extent: bibliometric analysis (tool 1) and remote reviews (tool 2). Onsite visits by panels of experts (tool 3) will be added after the implementation phase is completed.

1.1 MODULE 1 - Quality of Selected Results

This module is to motivate ROs to deliver research of a high standard by international comparison. The module is also to encourage research with a high potential for practical application. The evaluation principle is to have the selected results assessed by an expert panel in terms of quality, originality, and significance against international standards. A limited number of selected results are evaluated and assessed in two different categories. The key assessment criterion in the first category is the contribution to knowledge in the given discipline. The key assessment criterion in the second category is social relevance, or significance for society, and, where appropriate, the impacts (economic or otherwise describable benefits for society) of this significance. Social relevance is understood as both "utility" (typically industrial research generating economic profit) and "demand" (typically departmental research resulting from social demand).
• I. First category: particularly (but not solely) for basic research results
• II. Second category: particularly for applied research results

Only the results entered in the RIR may be included in evaluation. Each RO is to select the results for evaluation. ROs register their selected results in either category at their discretion, and a given result may only be registered once for the given institution in only one category (the same result may not be registered in both categories). ROs must also specify the field and sub-field of research for their results according to the OECD classification (Frascati Manual)5 and key words and, for the purpose of assessing applied research results, additional specifying attributes as appropriate, such as CZ-NACE or the priority fields/sub-fields of the National Priorities of Oriented Research, Experimental Development and Innovations.

The evaluation in a given year will cover the results and outputs realised in the five years prior to the year of evaluation (for instance, the results and outputs realised between 1 January 2014 and 31 December 2018 will be included in the 2019 evaluation, when full Module 1 will have been applied for the first time). The number of results submitted derives from the size of the organisation. The "size" measure is the volume of the LCDRO-type institutional aid granted in the previous period. Also, a "minimum number of results for submission" is defined to ensure that the number of results evaluated provides a general view of the standard of the institution's production over the last five years (an interval determined in valid legislation). The number of results for submission is based on the following principles:

1. A minimum of 10 results per single RO must be submitted, selected from the results realised in the past five years.
It is recommended that results be submitted in proportions corresponding to the internal structure of the RO with regard to research functional units. Those ROs which have produced fewer than the minimum number of results are not excluded from evaluation if they can explain their low number of results.6

2. If, in the year for which results are submitted for evaluation, the RO is a beneficiary of LCDRO institutional aid higher than CZK 10 million, the RO must submit one extra result for each CZK 10 million (whether full or not), and is required to submit the results in proportions corresponding to the RO's internal structure (with regard to research functional units); a worked sketch of this calculation follows below.

3. The RO selects the results for remote review evaluation and is responsible for the results it discloses for evaluation.

The term "research functional units" typically concerns the segments of universities and CAS. Research functional units can be research institutes, faculty-like organisational units7, or groups of faculties or institutes. For the selected results, the RO is to specify the proportionate representation of each research functional unit in the volume of the selected results, and justify these shares with regard to the RO's organisational structure, such as in connection with the internal re-distribution of the LCDRO.

5 OECD Fields of Research and Development (FRASCATI Manual 2015), corroboratively WoS Categories, or Fields and Subfields. See: Základní principy Hodnocení výzkumné a odborné činnosti pracovišť AV ČR za léta 2010-2014. [Basic Principles of Evaluation at the Czech Academy of Sciences in 2010-2014] Praha: Akademie věd ČR, 2015.
6 The institutional aid amount in year N determines the number of results to be registered in year N+1.
7 A term other than "organisational unit" is used on purpose. "University organisational unit" is a term of established use in the RIR; this term also covers non-research units, such as the rector's office.
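Points 1 and 2 above amount to simple arithmetic. The following is a minimal illustrative sketch, not part of M17+ itself: it assumes the reading that every started CZK 10 million of LCDRO aid above the CZK 10 million threshold adds one result to the minimum of ten, and the function name and constants are ours.

```python
import math

MIN_RESULTS = 10          # point 1: minimum number of results per RO
TRANCHE_CZK = 10_000_000  # point 2: one extra result per CZK 10 million of aid

def required_results(lcdro_aid_czk: float) -> int:
    """Illustrative count of results a RO would register for Module 1 evaluation.

    Assumes the reading that every started CZK 10 million of LCDRO aid above
    the CZK 10 million threshold adds one result to the minimum of ten.
    """
    if lcdro_aid_czk <= TRANCHE_CZK:
        return MIN_RESULTS
    extra = math.ceil((lcdro_aid_czk - TRANCHE_CZK) / TRANCHE_CZK)
    return MIN_RESULTS + int(extra)

# Example: CZK 35 million of aid gives 10 + 3 = 13 results under this reading.
print(required_results(35_000_000))
```

Under this reading the count grows stepwise with the aid volume; the actual rounding convention would have to follow the provider's interpretation of point 2.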
1.2 MODULE 2 - Research Performance

Overall research performance is a multidimensional category and includes productivity, quality and competitiveness in research and development. All these factors are necessary for ROs to operate correctly, whether they are scientific research organisations or research organisations primarily involved in applied research and development. The following indicators will be monitored:
• Bibliometric data covering all results produced by a RO in each discipline for the reporting period. The sources of these data will be international databases in those disciplines in which the major part of results are published in international journals.
• The Information System for Research, Experimental Development and Innovations (the "ISREDI") will be the main data source for the disciplines in which results are usually books or articles published elsewhere than in international databases, and for the results of applied research and development.
• In a range of disciplines, the results for evaluation are books or articles with many co-authors. This issue will not be addressed by mechanically determining the mathematical share of each author (RO) in the result.
• Volume and structure of the R&D&I funds obtained.
• Number and structure of employees.
• Additional quantitative analyses prepared using regular methods of descriptive statistics.
• Quantitative indicators and analyses for applied research.8

8 As per RAE/REF, for instance.

1.3 MODULE 3 - Social Relevance

Module 3 is particularly important for the ROs conducting applied R&D&I and directly serving users, such as industries, the public sector, or other ROs. The rate of positive impacts of R&D&I and their results on society and communities will also be evaluated in this module. The social relevance criterion will be applied to applied research results, which are of immediate importance to the economy, state and public administration, and cultural policies. This module will also include the evaluation of basic research results, which affect individuals and society indirectly (indirect impacts). The aspects that need to be taken account of for these points are: the relevance of and current need for the research focus; the methods proposed and applied; and the social significance of a particular research project as a whole.

This module is based on assessing parameters that monitor particularly the following: transfer of results into practice; collaboration with the application sphere; activities for transferring knowledge and technologies to non-academic entities; impacts on the quality of life of individuals and society; and economic benefits, welfare benefits and benefits for building national and cultural identity. Additional parameters include: involvement of students in research; optional lectures/seminars related to the research of the given RO; practical training of students; quality of education and participation of doctoral students; internationally and nationally renowned awards for research excellence; mobility of researchers between ROs and the industries and services sector, or the users of research results; RO significance to regional development; and popularisation and feedback.

The method of applied research evaluation will be further detailed at the level of full evaluation by providers/promoters, who will elaborate their own evaluation methodologies in accordance with M17+ and the relevant schedule described therein. This gives room for including a range of additional criteria in the modules (Module 3 in particular) used in the full five-year evaluations, which have key impacts on RO scaling.

1.4 MODULE 4 - Viability

Module 4 will assess the quality of management and internal processes of ROs in these aspects:
Research environment - organisational chart, the quality of research management, HR policies, HR structure and development, and research infrastructure facilities and organisation.
International and national collaboration - membership in the global and national research communities, community activities.
External funding - international and national cooperation and presentation of research and collaboration, student or young researcher fellowships abroad, prestige of research, participation in the activities of the expert community, success in obtaining projects and co-funding (third-party funding).
Grant projects completed with success, including final evaluation and the option to request a review report.
Position of a RO according to international indicators and statistics.
Basic structure of costs and revenues in each year of the reporting period - all the grant and programme projects receiving public national or European funds or funds from other foreign sources in the reporting period where the research site is the beneficiary or a co-beneficiary; contracted research; collaborative research and the transfer of technologies; external funding (either purpose-tied or contracted); licence revenues; spin-offs; and revenues from the sale of patents and licence agreements.

Evaluation tools:
• Statistical data and indicators at national and international levels;
• A list of all the grant and programme projects receiving public national or European funds or funds from other foreign sources in the reporting period where the research site is the beneficiary or a co-beneficiary;
• Self-evaluation reports, annual reports and other similar documents specified for the given segment;
• International awards won by the RO under evaluation;
• Onsite visits by expert panels (tool 3), particularly in the Universities and CAS segments.

1.5 MODULE 5 - Strategy and Policies

A good research strategy of a RO defines the basis for future development, and the quality of the strategy is a critical factor for expert panels. This criterion is significant for all ROs. Strategy and policies covers the monitoring of parameters in the following: research strategy reasonability and quality; the organisation's mission (purpose and strategic direction); policies (the steps by which the mission has been implemented); the implementation of the policies; vision for the next period; links to the implementation of the policies of the provider/promoter; links (if any) to the implementation of higher strategic goals; and the measures resulting from valid national and supranational documents.

Evaluation tools:
• Implementation of policies;
• Reasonability and feasibility of the research strategy;
• Self-evaluation report (for Universities and CAS) or the long-term development policy progress report (for Departmental ROs);
• Continual checks, such as mid-term evaluations.

1.6 Definition of Results

The duty to submit data on R&D&I results is regulated in section 12(1) of Act No. 130/2002 Coll. The definitions of types of result, the criteria for their capacity to be verified, and the method of entering data in the ISREDI will be updated in connection with M17+ to facilitate keeping national records of R&D&I. The definitions of the types of result must be updated and implemented while taking account of continuity with the existing definitions and preventing retrospective impacts. In order to facilitate updating without the need to open the source document, these issues will be addressed in a separate document subject to governmental approval. The definitions provided in the 2013-2016 Methodology continue to apply until the new ones have received governmental approval.9

9 "Dále uvedené definice výsledků jsou platné od roku 2013 včetně." [The results definitions listed below have been applicable since 2013] See: Úřad vlády ČR. Metodika hodnocení výsledků výzkumných organizací a hodnocení výsledků ukončených programů (platná pro léta 2013 až 2016) [Office of the Government of the Czech Republic. Methodology for Evaluating Results of Research Organisations and Results of Completed Programmes (valid for 2013 to 2016)], approved under Government Resolutions No. 475 of 19 June 2013, No. 250 of 16 April 2014 and No. 605 of 29 July 2015, respectively. 2013, p. 32.

2. NATIONAL LEVEL OF EVALUATION

National M17+ evaluation should produce:
• Evaluation of the situation in R&D&I in the Czech Republic, the comparison of Czech R&D&I with foreign countries, and the risks and opportunities for Czech R&D&I, plus the generation of related action;
• Continual evaluation under section 7(7) of Act No. 130/2002 Coll. using the annual evaluation of the results of all ROs.

National evaluation is based on the distribution of competence of the institutions involved in managing or funding R&D&I activities (see Figure 2). National evaluation is based on joint standards and ensures comparability across R&D&I and RO segments, particularly in terms of the quality of results.

2.1 Evaluation of Situation in R&D&I

This is strategically targeted evaluation focusing on gathering data which allow the government to adopt decisions about the R&D&I NP (progress, and changes if any) and to propose R&D&I SB expenditures along with the mid-term outlook (see Table 2 in Appendix 3). The evaluation of the situation in R&D&I is based on:
• Analyses made by the RDI Section (or in collaboration with other institutions), either as reports describing the whole sphere of research, development and innovations in the Czech Republic or as analyses dealing with a segment of R&D&I, such as a discipline of basic or applied research, or horizontal activities and aspects;
• Analyses and reports made by the OECD, the EC, or any of the relevant Czech institutions;
• Departmental reports and the evaluation results for the ROs promoted or funded by the said institutions;
• Thematic evaluations, such as evaluation of a discipline across the segments, or evaluation in relation to national priorities or the National RIS3 Strategy.
The basic questions to be answered by national evaluation concern the overall performance of R&D&I, progress in the government-approved R&D&I NP and, where appropriate, any need to correct its development or focus.

2.2 Annual Evaluation of Results

The RDI Council/RDI Section are responsible for meeting the statutory duty10, i.e. to ensure annual evaluations of R&D&I results. These bodies submit these outputs for the provider-level evaluation. They receive from providers the outputs of providers' evaluations in order to evaluate the national situation in R&D&I. The RDI Council approves the result of the annual evaluation that the Council is required to ensure under section 35(2)(d) of Act No. 130/2002 Coll.

10 Section 7(7) of Act No. 130/2002 Coll., on the support of research and development from public funds and on the amendment to some related acts (the Act on the Support of Research and Development).

2.3 Expert Oversight of Evaluation

The oversight of evaluation lies with the RDI Council. The Council's tasks are to:
• Oversee compliance with the evaluation principles;
• Deal with debatable issues, ambiguities, and relevant queries.
The RDI Council will not interfere with the evaluation bodies delivering their expert assessments.

2.4 Providing Conditions for Evaluation

Qualified and competent people are a precondition for the change in the evaluation system to be a success.
To provide such people, a new organisational unit within the Funding Department of the Research, Development and Innovations Council is established.11 The R&D&I Evaluation Unit will be responsible for:
• Preparing policies for RO evaluation (universities, departmental research organisations, and CAS research institutes) in collaboration with the providers of institutional aid, the promoters of research organisations, and CAS;
• Organising periodic evaluations of universities in collaboration with providers (the MEYS, the Ministry of Defence, and the Ministry of the Interior);
• Supporting ministries in conducting periodic evaluations of departmental research organisations;
• Evaluating R&D&I centrally while taking account of the current R&D&I NP;
• Conducting specialised evaluations of various R&D&I aspects;
• Conducting annual evaluation of all ROs using the selected Module 1 and Module 2 indicators.

11 At first, the R&D&I Evaluation Unit will comprise a total of five new service posts, including: one "governmental principal" service post for the head of the unit, one service post for a civil servant in charge of strategies, and one "governmental assistant principal" service post.

The R&D&I Evaluation Unit will collaborate with:
• the ISREDI Unit in using the information in the ISREDI;
• the Analyses and Budget Unit in evaluating R&D&I centrally and using the evaluation results for preparing analyses and proposing R&D&I SB expenditures.
The R&D&I Evaluation Unit will also prepare documents for meetings with providers/promoters where evaluation results are discussed.

2.5 Technical Supplies, IT Support

An application linked to the relevant code-lists in the ISREDI will be set up for collecting the results registered for evaluation, in order to reduce the administrative burden on ROs. However, adding attachments, annotations and comments in ISREDI 2.0 cannot be an integral part of the RIR until the necessary changes are made to legislation and in order to ensure cybernetic security for ISREDI 2.0 as a major information system12. Also, there will be additional applications for processing data for panel discussions and providing online access for expert panel members and evaluators.

12 See also Government Decree No. 397/2009 Coll.

2.6 Evaluation Tools and Expert Panels

The following will be the basic tools of evaluation:
• Tool 1 - bibliometric analysis
• Tool 2 - remote reviews
Two basic tools will be used for evaluating selected results: either bibliometric analysis or remote reviews by external evaluators. The organisation registering a result for evaluation is to suggest the suitable tool. Where appropriate, the expert panel may revise the suggestion (for example, because of suspicion of fraudulent, "predatory", journals13). The new method of evaluation places greater emphasis on the evaluation of applied research results, for which bibliometrics are not a particularly appropriate tool and which, like the results in a considerable part of SOCIAL SCIENCES and HUMANITIES AND THE ARTS (SSHA)14, require expert review assessment.

13 Generally speaking, a situation when an expert panel is of the opinion that the bibliometric data associated with a given result are a consequence of fraudulent practices.
14 OECD (2015), Frascati Manual 2015: Guidelines for Collecting and Reporting Data on Research and Experimental Development, Classification and distribution by Fields of Research and Development (FORD), OECD Publishing, Paris. Available from: http://dx.doi.org/10.1787/9789264239012-en

[Figure 2: General National Evaluation Chart - annual national evaluation (evaluation of the situation in R&D&I and evaluation of ROs' results pursuant to Act No. 130/2002 Coll.) is conducted by the RDI Council/RDI Section, followed by discussions between the provider, the RDI Council/RDI Section and experts and by monitoring of grades A, B, C, D; the full five-year evaluation in the modules Quality of Selected Results, Research Performance, Social Relevance, Viability and Strategy and Policies is conducted by the governmental departments in collaboration with the RDI Council/RDI Section, again followed by discussions between the provider, the RDI Council/RDI Section and experts.]

Six expert panels, representing the six fields of research and development (Figure 3), will be set up to evaluate the quality of selected results. The fields of research and development comprise those defined by the OECD (Frascati Manual).15 Experts in applied and industrial research and experts from practice will also sit on these panels.

15 OECD Fields of Research and Development (FRASCATI Manual 2015), corroboratively WoS Categories, or Fields and Subfields. See: Základní principy Hodnocení výzkumné a odborné činnosti pracovišť AV ČR za léta 2010-2014. [Basic Principles of Evaluation at the Czech Academy of Sciences in 2010-2014] Praha: Akademie věd ČR, 2015.

Bibliometric Analysis (TOOL 1)
The results published in journals indexed in internationally recognised citation databases16 will be evaluated through internationally recognised bibliometric approaches. This will produce a structured set of bibliometric indicators17 with information on each output evaluated - including the bibliometric data obtained by international comparison - that will facilitate further aggregation of data: by RO, RO organisational unit, field of research, subject of research, etc. The RDI Council/RDI Section will prepare the underlying data. Panels will add their comments on these underlying data.

16 According to the current definitions applicable: result types Jimp, Jsc, D.
17 For instance, ranking according to AIS (Article Influence Score, Web of Science) or SJR (Scimago Journal Rank, Scopus).

Remote Reviews (TOOL 2)
The expert panel will choose remote reviewers to have the results evaluated through remote review. The evaluation is to judge whether the result meets the global or national standard of quality for the given field of research, rank the result on a scale between 1 and 5, and provide brief reasoning.

An expert panel is a group of experts coordinating the reviews of the research outputs in the field of research corresponding to their expertise. A panel is directed by a head and a deputy head. Expert panel members distribute results for review assessment to external evaluators and decide debatable cases. They also prepare recapitulative expert comments on the results evaluated by Tool 1 (bibliometric analysis) in their field of expertise. The panel for each field of research has at least as many members as the subjects included in the panel's field of research. Each panel is chaired by a head. If the head is an internationally renowned foreign expert, a respected home expert should be the deputy. The composition of expert panels will be different for the implementation period covering 2017 and 2018, and the following period, i.e.
that starting in 2019 (see Chapter 4). Providers/promoters, ROs and other stakeholders will be called on to nominate persons for expert panels and for the reviewer database (experts for the fields of research in which they are involved). The evaluator/reviewer database will include as many foreign experts as possible. Existing databases may be used as templates for the evaluator database, while specifying the relevant attributes to facilitate the classification of results. The evaluator database will also be created with regard to assessing applied research results, by including experts in applied research. The possibility of collaborating with renowned international research societies and institutions will be considered when creating the database and addressing experts for expert panels. The nomination procedure will be similar to that for nominating experts on expert panels.

Evaluators/reviewers are experts who assess the outputs submitted for remote review evaluation. These experts are registered in a database, which includes data on their specific subjects, sub-subjects and specialisation, and other information as may be required (for instance, with regard to assessing applied research results, the subject and topics are recorded using key words or the CZ-NACE classification). Evaluators are not members of expert panels. Evaluators may be included in the database from the expert panels that perform evaluation using the existing 2013-2016 Methodology, from other functional and proven databases, and from among experts recommended by the RDI Council, with emphasis placed on a significant share of evaluators from other countries. The evaluators in the database can have the status of non-remunerated expert, and will not be contacted until a result matching their expertise is registered for evaluation.

[Figure 3: Organisational Chart of Expert Panels18 - six panels, each led by a chairperson, covering the OECD fields of research and development and their subfields: 1. Natural Sciences; 2. Engineering & Technology; 3. Medical & Health Sciences; 4. Agricultural & Veterinary Sciences; 5. Social Sciences; 6. Humanities & the Arts.]

18 OECD Fields of Research and Development (FRASCATI Manual 2015), corroboratively WoS Categories, or Fields and Subfields. See: Základní principy Hodnocení výzkumné a odborné činnosti pracovišť AV ČR za léta 2010-2014. [Basic Principles of Evaluation at the Czech Academy of Sciences in 2010-2014] Praha: Akademie věd ČR, 2015.

2.7 Evaluation Procedure and its Formal Aspects

An expert panel is led by a head, who coordinates and monitors the work of panel members and evaluators but does not evaluate any output.
The panel head is responsible for harmonising the levels of the suggested evaluators across the subjects of research in order to ensure their levels of expertise are comparable. Each output submitted for review evaluation will be assessed by two evaluators. The evaluator reviews the result, scores it on the quality standard scale between 1 and 5, and provides reasoning for the score using the characteristics given below.

Qualitative Scale for Category I. Evaluation criterion - contribution to knowledge (for basic research results, in particular):19
(1) Results that are world-leading in terms of originality, significance and the efforts required to obtain the results.20
(2) Results that are internationally excellent but not top level in terms of originality, significance and the efforts required to obtain the results.
(3) Results that are recognised internationally in terms of originality, significance and the efforts required to obtain the results.
(4) Results that are recognised nationally in terms of originality, significance and the efforts required to obtain the results.
(5) Results that fail to meet the standard for being recognised nationally.21

19 The scale and the related comments are adopted from OECD Fields of Research and Development (FRASCATI Manual 2015), corroboratively WoS Categories, or Fields and Subfields. See: Základní principy Hodnocení výzkumné a odborné činnosti pracovišť AV ČR za léta 2010-2014. [Basic Principles of Evaluation at the Czech Academy of Sciences in 2010-2014] Praha: Akademie věd ČR, 2015.
20 The "world-leading standard" is the absolute top quality in any subject or sub-subject of research and development.
21 The terms "world-leading standard", "recognised internationally" and "recognised nationally" refer, in this context, to standards of quality. They do not refer to the nature or the territory of the originator of the result, or the place where research is conducted, or the territory where it is spread. For instance, research into a topic specific to the Czech Republic may meet the "world-leading standard". On the other hand, research with an international focus may not meet the "world-leading", "excellent internationally", or "recognised internationally" standards.

Qualitative Scale for Category II. Evaluation criterion - social relevance (for applied research results in particular):
(1) World-leading results, the practical utilisation of which will bring about a critical change with international economic impact (real likelihood of broad application on multiple international markets, etc.) or a change with extraordinary international impact on society (real likelihood of critical international application in spheres of public interest).
(2) Excellent results, the practical utilisation of which will bring about a change with international economic impact (real likelihood of application on multiple international markets, etc.) or a change with significant impact on society (real likelihood of critical application in spheres of public interest).
(3) Very good results, the practical utilisation of which will bring about a change with economic impact in the Czech market or a change with impact on society (real likelihood of application in spheres of public interest).
(4) Average results, the practical application of which will bring about a partial change with economic impact in the Czech market or a partial change with impact on Czech society (real likelihood of partial application in spheres of public interest).
Figure 4: Quality Evaluation of Selected Results [diagram: each RO result registered for evaluation is assessed either by two reviewers, whose evaluations are reconciled by the panel, or through bibliometrics assessed by the panel]
Figure 5: Recapitulative Report Preparation Chart [diagram: the evaluated results of each RO are aggregated by the RDI Section into a Recapitulative Provider-level Report]
After the bibliometric indicators for each result evaluated with Tool 1 (bibliometrics) are processed, the data (i.e. including the evaluations of each result evaluated with Tool 2) will be aggregated to subjects of research, and submitted to panels for analysis, assessment and expert comment.
2.8 Report for Research Organisation and Provider
The RDI Section will prepare a short structured report for the given RO using the panel's assessment of and comment on the results. This report must give an overview of the number of results by rating, including reasoning, the breakdown of results by subject of research, and a summary of the bibliometric indicators, along with a recapitulative comment from the panel. Using the reports for ROs, the RDI Section will prepare a short recapitulative report for the provider level, which will include the reports for each RO. This report will be the source document for discussions with providers.
2.9 Bias
The third sentence of section 21(1)(3) of Act No. 130/2002 Coll. will be applied mutatis mutandis to judge any bias: "No commission member may be biased towards any participant or the subject-matter of public competition in research, development and innovations; in particular they may not take any part in project preparation, have any personal stake in a decision granting aid to a specific project or have any personal, work-related or other ties to participants." More detailed specification will be provided in the Charters and Rules of Procedure of each evaluation body in accordance with the standards binding for the evaluation of the Czech Academy of Sciences and the IPn Methodology standards (see Appendix 2).
3. IMPLEMENTATION PERIOD
The new evaluation methodology meeting international standards will be phased in over 2017-2019 (the implementation period), and this transition period will cover, at the national level, the tools in Modules 1 and 2.
Providers and the individual parts of the system differ from each other in their preparedness for full evaluation. This fact is reflected in the implementation of M17+ and is also emphasised in the relevant passages of the text. CAS has already completed full evaluation, and awaits the next full evaluation to be conducted in 2020. Full evaluation in the departmental segment will be conducted in 2017 and 2018. Full evaluation in the universities segment will start to be phased in during 2017 and be completed by 2020.
Table 1: M17+ Implementation (columns: Competence / 2017 / 2018 / 2019 / 2020)
RDI Council/Section - results that can be captured by bibliometrics (according to the current definitions in force: result types Jimp, Jsc, D, i.e. collections or proceedings registered in Scopus or WoS):
• 2017: All results - bibliometric analysis assessed by home panels (by M2). (A home panel is an expert panel predominated by home experts.)
• 2018: See 2017.
• 2019: Selected results - foreign panels + remote reviews; evaluation criterion: contribution to knowledge (by M1). All results - bibliometric analysis assessed by foreign panels (by M2). (A foreign panel is an expert panel with a major share of foreign experts.)
• 2020: See 2018.
RDI Council/Section - results that cannot be captured by bibliometrics (according to the current definitions applicable: result types other than Jimp, Jsc, D):
• 2017: Selected results evaluated by the social relevance criterion - home panels + remote reviews (home evaluators); selection key - a percentage of the total volume of the registered results of the given type. Selected results evaluated by the contribution to knowledge criterion - selection key - a percentage of the total volume of the registered results of the given type. All results - the Council to submit to the government by 30 June 2017 a supplement to M17+ addressing the verification/registration of SSHA outputs.
• 2018: See 2017. Selected results - home panels + remote reviews (home and foreign evaluators); evaluation criterion: contribution to knowledge. All results - according to the supplement to M17+.
• 2019: Selected results - foreign/home panels + remote reviews (foreign/home evaluators); evaluation criterion: social relevance (by M1). Selected results - foreign panels + remote reviews; evaluation criterion: contribution to knowledge (by M1). All results - according to the supplement to M17+.
• 2020: See 2018.
Universities: 2018: M2 (the rest); 2019: M3-M5; 2020: M3-M5.
CAS: 2017: Clustering of institutes by the full 2016 evaluation.
Governmental Departments: 2017: M2-M5 (some departments); 2018: M2-M5 (the rest of the departments).
Year 2017
The quality evaluation of the RO results realised in 2016 will be the basis for national evaluation in 2017. Expert panels of national experts will be set up for each field of research and development according to the OECD structure (the Frascati Manual). Using international databases and bibliometric tools (as per Module 2), a bibliometric analysis of all journal outputs will be prepared for all RIR results for which bibliometrics are the appropriate tool of evaluation. A qualitative results profile22 for each RO will be created, and these results will be distributed into quartiles by the subject-related AIS23 (Article Influence Score, Web of Science) of the relevant subject-related periodical. The results in the first decile of the given subject by AIS will be captured separately. An analogous approach will be taken for the results indexed in the Scopus database by SJR (Scimago Journal Rank).
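As an illustration of the quartile and top-decile assignment just described, the sketch below bins one journal result by the AIS of its journal within the journal's subject category; the function name and the use of simple rank shares as thresholds are assumptions, and the same logic would apply to SJR for Scopus-indexed results.

```python
from typing import List

def ais_profile(result_ais: float, subject_ais_values: List[float]) -> str:
    """Place one journal result into an AIS quartile of its subject category,
    flagging the first decile separately; thresholds are simple rank shares."""
    ranked = sorted(subject_ais_values, reverse=True)
    rank = sum(1 for v in ranked if v > result_ais) + 1      # 1 = highest AIS
    share = rank / len(ranked)
    if share <= 0.10:
        return "D1 (first decile)"
    if share <= 0.25:
        return "Q1"
    if share <= 0.50:
        return "Q2"
    if share <= 0.75:
        return "Q3"
    return "Q4"

# e.g. ais_profile(2.4, [3.1, 2.4, 1.8, 1.2, 0.9]) -> "Q2"
```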
The bibliometric profiles will also express the share of a given RO in all top results (by AIS or another quality indicator appropriate to the given subject of research) in the given subject of research in the Czech Republic, and/or in the given segment (universities, CAS), as may be the case. This multi-dimensional bibliometric report will be checked by the expert panel, who will correct the errors of blind bibliometrics. Using the bibliometric data, the expert panel will assess the quantity and the quality of published outputs, and the panel's report will be guidance for providers and ROs in their internal quality assurance processes. This method will be applied to evaluate annually all results that can be captured by bibliometrics.24 Where appropriate and feasible, statistics will be developed in selected subjects of research and for selected institutions, providing a benchmark: similarly prepared bibliometric analyses of results for suitably chosen comparative foreign ROs or institutes (a prestigious European university, a selected institute of the Max Planck Society, a comparable institute of applied research, etc.). Where appropriate, the expert panel will include this comparison in their report.
22 As the profile will be of a comparative nature, it will be possible to judge, by the given comparative criteria, the standing of the given institution (whether it is better, worse or comparable in the relevant context).
23 Current bibliometrics prefer AIS, which was used for the last evaluation of CAS. However, the new evaluation methodology does not fix the (non)use of other bibliometric indicators, as each approach has its own pros and cons. The parameters to be applied will always be published in advance. Analysis will be robust so that a change in partial indicators could not significantly affect analysis results in terms of the success rate of ROs.
24 These are the result types Jimp, Jsc, D according to the current definitions applicable.
In the subjects of research in which bibliometrics are an imperfect or an utterly impracticable tool of evaluation, i.e. the subjects of research in which international databases only capture a small portion of the research results registered in the ISREDI, such as most SSHA subjects and mathematics, and in applied research, the selected results registered for evaluation by ROs will be evaluated. Each year, 10% of the total results not able to be captured by bibliometrics will be evaluated.25 Depending on the volume of the results26 they registered, ROs will, in the given year, choose and register results for evaluation at their discretion for Category I (the knowledge contribution criterion) or Category II (the social relevance criterion); see also Chapter 2.1. The ROs will justify their choice and substantiate it with suitable material.27 The selected social relevance results realised in 2016 will be evaluated in 2017. Results will be evaluated by the relevant national expert panel, comprised of academics, applied research experts and experts with practical experience, with the use of remote reviews. This will result in classifying the selected results by ranking.
25 For 2014, approximately 30,000 results unable to be captured by bibliometrics, i.e. results other than Jimp, Jsc, D, have been registered in the RIR. The quantity of results selected for evaluation is determined as a percentage in a manner as to ensure that the total annual results evaluated by this method are kept under approximately 3,000, with regard to the evaluation feasibility requirements (proven by the practice up to now). If the quantity of this type of result should unexpectedly increase or decrease, the RDI Council will change the percentage of results for evaluation in order to maintain feasibility.
26 Solely the quality of the results registered for evaluation, rather than their quantity, is considered in the evaluation.
27 According to the current definitions applicable, it can be any result other than Jimp, Jsc, D, that is, including results A, E, W and O, among others; support documents include an annotation with reasons, the publication of the result for evaluation, and others, such as selected reviews, expert opinions, economic indicators, etc.
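A minimal sketch of the selection key described above and in footnote 25, assuming the cap of roughly 3,000 evaluated results per year is applied by lowering the percentage; the function name and the exact adjustment rule are illustrative, since the document only states that the RDI Council will change the percentage to maintain feasibility.

```python
from typing import Tuple

def selection_key(total_registered: int, share: float = 0.10,
                  feasible_cap: int = 3000) -> Tuple[int, float]:
    """Number of non-bibliometric results to be registered for peer review and
    the share actually applied; if 10% would exceed the feasibility cap, the
    share is lowered, mirroring the RDI Council's option to adjust it."""
    selected = round(total_registered * share)
    if selected > feasible_cap:
        share = feasible_cap / total_registered
        selected = feasible_cap
    return selected, share

# e.g. for the roughly 30,000 such results registered for 2014:
# selection_key(30_000) -> (3000, 0.10)
```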
An ad hoc analysis will be prepared of the impacts of the 2013-2016 Methodology on the publication results that cannot be captured by bibliometrics, especially in the SSHA subjects of research. This analysis will be an auxiliary document for the further evaluation of these subjects and a joint work of the RDI Section, the Results Evaluation Commission, and the recently established SSHA advisory body to the RDI Council.
Year 2018
The results for 2017 and the bibliometrically inexpressible results of Category I realised in respect of the years 2016 and 2017 will be evaluated. Evaluation will be carried out in a similar way as in 2017, i.e. the number of results accepted for evaluation in 2018 will follow the number of results accepted for evaluation in 2017. Unlike in 2017, the results will also be evaluated by a home expert panel using Tool 2 - remote reviews (home/foreign evaluators), according to Criterion I (contribution to knowledge) for non-bibliometric results (for 2016 and 2017). Throughout the period of force of Act No. 130/2002 Coll., which requires annual evaluation of the results of all ROs, the shape of the 2018 national evaluation will be preserved as the shape for the annual evaluation of results. Approximately 55% of the results claimed in the RIR over the past year will annually be put to national evaluation.28
28 To draw a comparison, a 5% random sample is audited in the regular qualitative audit.
Year 2019
The year 2019 will be the first year for all ROs to be evaluated by full Module 1 and Tool 1 - bibliometric analysis - by Module 2. Where appropriate and expedient, selected outputs from all ROs for the period 2014-2018 will be assessed by international expert panels and predominantly foreign assessors, and distributed by quality using the relevant rating scales. The number of selected results will be determined by the LCDRO while taking account of subject-related specifics and after reviewing the experience with the selection key for the period 2017-2018. A five-year period is chosen because the evaluation is a foreign evaluation that will be the basis for scaling ROs. Expert panels will also have available the evaluation results from 2017 and 2018.
3.1 Principles of Funding in Implementation Period
The funds for institutional funding of RO development will be divided into two components: stabilisation (base) and motivation (increase); see Figure 6.
3.2 Base Fixation
Base. These funds are based on fixing 100% of the distribution of LCDRO according to the 2013-2016 Methodology approved under Government Resolution No. 475 of 19 June 2013 and Government Resolution No. 250 of 16 April 2014 and No.
605 of 29 July 2015, respectively, in accordance with the government-approved draft.29 The LCDRO-like institutional funds will be fixed at the level of individual research organisations. This procedure is in accordance with the existing legal rules of Act No. 130/2002 Coll. Section 7(7) of the act provides that "the provider may modify the amount of aid according to a more detailed evaluation using internationally recognised methodologies that the provider must publish along with the results of the more detailed evaluation and the rules for aid modification before the aid is provided."
29 Fixing the base in this manner ensures that evaluation results for past years are reflected in the funding of ROs.
3.3 Distribution of Aid Increase by Evaluation
Increase. Additional funds, no less than the year-on-year increase in the LCDRO, will be distributed using the evaluation results. Evaluation will produce a distribution of ROs into four groups - A, B, C, and D - by the quality of research.
Figure 6: LCDRO Distribution Diagram [diagram: the LCDRO is split into a base distributed using the outputs of the 2013-2016 Methodology and an increase distributed using the M17+ methodology]
The government approved the draft R&D&I SB expenses for 2017 with a mid-term outlook for 2018 and 2019 and a long-term outlook up to 2021 following the draft adopted by the RDI Council at its 315th meeting on 6 May 2016; the draft incorporates a regular increase of the LCDRO as shown in the table in Appendix 3. This actually allows the entire R&D&I system to gradually adopt and be adapted to the new system of evaluation because it secures the preservation of the existing level of funds for all.
The fixation and the increase distribution under M17+ differ by segment:
• CAS has conducted its own evaluation using its own methodology and, from 2017, has been able to fix 100% of the initial amounts for its ROs and redistribute increases in connection with the distribution of its organisations into A, B, C, and D.
• Governmental departments expect to conduct their full evaluation in collaboration with the RDI Council/RDI Section pursuant to M17+ in 2017 or 2018; therefore, increases can be redistributed by the distribution of their organisations into A, B, C, and D30 from 2018 and 2019.
• The fixation amount and the method of redistributing increases for universities for 2017 and 2018 are currently being negotiated; the results from the 2015 Evaluation, and the 2016 Evaluation under the currently applicable 2013-2016 Methodology, can be used for the fixation. As more modules are phased in and the robustness and quality of evaluation increases, a more differentiated redistribution of increases can start in 2019.
30 For details please refer to Chapter 5.7.
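The base-plus-increase principle of Chapters 3.1-3.3 can be sketched as follows; this is a minimal illustration under stated assumptions - the group weights and the proportional redistribution rule are not prescribed by M17+, which leaves the concrete translation of the A-D groups into shares of the increase to the discussions with providers.

```python
from typing import Dict, Optional

def lcdro_allocation(base: Dict[str, float], increase_total: float,
                     grades: Dict[str, str],
                     group_weights: Optional[Dict[str, float]] = None) -> Dict[str, float]:
    """Base funding is fixed at 100% of each RO's previous LCDRO distribution;
    only the year-on-year increase is redistributed according to the A-D groups.
    The weights below are illustrative only."""
    if group_weights is None:
        group_weights = {"A": 1.5, "B": 1.0, "C": 0.5, "D": 0.0}
    weighted = {ro: base[ro] * group_weights[grades[ro]] for ro in base}
    total_weight = sum(weighted.values()) or 1.0
    return {ro: base[ro] + increase_total * weighted[ro] / total_weight for ro in base}

# e.g. lcdro_allocation({"RO1": 100.0, "RO2": 100.0}, 20.0, {"RO1": "A", "RO2": "C"})
# -> {"RO1": 115.0, "RO2": 105.0}
```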
4. DISCUSSING EVALUATION RESULTS WITH PROVIDERS
4.1 Discussing Full Evaluation with Providers
Full evaluation through all modules in five-year cycles results in putting ROs on the following four-degree scale:
A - Excellent
• Institution internationally competitive in the research parameters of global fields of research, and/or institution with a strong innovation potential and excellent applied research results, and/or institution excellently fulfilling its mission.
B - Very good
• Institution of stable quality with excellent results in research, sufficient innovation potential, and/or significant applied research results; R&D&I results correspond to the purpose of the institution.
C - Average
• Institution of unstable quality achieving prevailingly good or average results in the parameters of basic and/or applied research, and/or institution fulfilling its purpose in an average manner.
• ROs with strategies and efforts to remove weaknesses and deficiencies.
D - Below average
• Institution below average in the vast majority of the parameters of basic and/or applied research.
• ROs with a range of weaknesses and deficiencies, and limited efforts to remove them.
RO scaling is discussed by:
• Representatives of the provider31 (the representatives of the expert advisory body of the provider/promoter if invited to discussions by the provider/promoter, plus, for example, the representatives of the National Accreditation Authority in the case of university evaluation);
• Representatives of the RDI Council/RDI Section;
• Heads (deputy heads) of panels, or experts;
• Representatives of the Czech Rectors Conference, in the case of universities.
31 If the provider is different from the promoter, representatives of the promoter are also invited.
The scaling is the result of joint discussions between the provider/promoter, the RDI Council/RDI Section and the representatives of expert panels, plus the representatives of the Czech Rectors Conference if a university is discussed. In the discussions, the RDI Council/RDI Section should take particular account of national evaluation results, with special emphasis on the robust evaluation of results through a combination of selected tools in Module 1 and Module 2,32 which providers adjust according to their organisations' missions and the progress therein. The format expects mutual agreement, but it is the provider's power - in accordance with law - to determine the amount of aid;33 i.e. even a RO evaluated as below average may, if duly justified, receive better aid than would correspond to its evaluation (consequently, the existing practice of mechanically linking funding and evaluation, and thus reducing evaluation to a tool for financial aid quantification, is discontinued). The relevant persons will comment on whether a RO in the given governmental department meets the criteria for the given rating according to the sum of the annual evaluations for the past period conducted at national level, and on the basis of full evaluations prepared for each segment of R&D&I. The Recapitulative Provider-level Reports for the previous period, experts' opinions, and the provider's opinion will be the basis for discussions.34 The resulting evaluation will thus take account of both the results achieved, and the RO's mission and role in the system of R&D&I (see Fig. 7). A report is to be prepared to evidence the result of RO evaluation, and must include the basic identification data for the underlying documents, the method and result of evaluation, and reasons. The parties concerned make their comments on whether the RO meets the qualitative rating proposed.
32 See the description of the phasing in during the implementation period.
33 Scaling is part of the input information for the provider to decide the amount of LCDRO; see section 5a(2)(b) of Act No. 130/2002 Coll. on the support of research and development from public funds and on the amendment to some related acts (the Act on the Support of Research and Development).
34 If the provider is different from the promoter, the promoter's opinion is also taken into account.
The evaluation's result and recommendations
will be discussed with the management of the evaluated RO, plus the provider (for state universities). The comments on and the discussion with the RO over the evaluation result are taken into account in the full evaluation, which determines the framework of funding for the next five years. If the RO disagrees with the final report, it may file, and must give reasons for, an appeal to the RDI Council within a defined time limit and request that the evaluation be re-discussed. If the provider or the RDI Council/RDI Section finds this appeal well-grounded, the evaluation of the RO will be re-discussed, with the RDI Council and the provider taking part.35 If a university lodges an appeal and gives reasons, this appeal is always discussed (given the importance of the evaluation for the university accreditation procedure conducted by the National Accreditation Authority), with representatives of the Czech Rectors Conference taking part. This rule applies to universities even during the implementation period.
35 If a ministry is the provider/promoter, it is desirable to ensure that the appeal be discussed at an organisational level of the provider/promoter higher than that which has adopted the original opinion.
Annual evaluations, limited in scope and informative capacity to the monitoring role, will be discussed at the level of the provider and the RDI Council. Underlying documents, including reasons, will be submitted to the provider for comment. It is up to the provider whether or not he prepares his comment in direct collaboration with beneficiaries. The underlying documents, the comments on them, and the discussion report will be published along with the scaling of institutions into bands A, B, C, and D. The RDI Council approves discussion results pursuant to section 35(2)(d) in accordance with the current wording of Act No. 130/2002 Coll. Once approved, the results will be published along with the reasons.
Figure 7: Discussing Evaluation Results with Providers - Full Five-year Cycle Evaluation using All Modules [diagram: national-level Recapitulative Reports for the provider level and full evaluation results by segment feed discussions between the provider, the RDI Council/RDI Section and the heads of panels/experts, resulting in the classification of ROs into A, B, C, D]
4.2 Annual Discussion over National Evaluation with Provider
Once full evaluation is completed in a given segment, the annual evaluation provided by the RDI Council/RDI Section only serves the purpose of continual monitoring and adopting relevant measures, without revising the distribution of organisations into groups A, B, C, and D. The exception is the evaluation during the implementation period 2017-2019 (see Chapter 4) because the 2017 evaluation will suggest informative RO scaling, which may be modified to match the 2018 evaluation results and, in particular, the 2019 international evaluation.
The discussion procedure is similar to that described in 5.1, and the initial RO classification into A, B, C, and D by full evaluation in five-year cycles will be discussed. The parties concerned will comment on whether the ROs in the given governmental department meet this qualitative grade on a year-on-year basis in the light of national evaluation. The fundamental underlying documents for discussions include the outputs from the previous annual selected results evaluations for the given interval between full evaluations. They also monitor trends, and recommend changes and measures as appropriate. They base their decisions on the Provider Level Recapitulative Report, experts' opinions, and the provider's opinion.36 The resulting evaluation will thus take account of both the results achieved, and the RO's mission and role in the system of R&D&I.
36 If the provider is different from the promoter, the promoter's opinion is also taken into account.
Figure 8: Discussing Evaluation with Provider - Annual Evaluations [diagram: national-level Recapitulative Reports for the provider level and the initial qualitative grades for each RO feed discussions between the provider, the RDI Council/RDI Section and the heads of panels/experts; for each segment (governmental departments, universities) the classification of ROs into A, B, C, D is monitored]
4.3 Annual Discussion of Evaluation with Provider in Implementation Phase
The procedure in the implementation phase, i.e. before full evaluation with all modules is carried out, is similar to that in the regular annual evaluation (see Chapter 5.2) but takes into account a different initial situation in each R&D&I segment:
CAS
The initial qualitative grade will be available for the CAS segment, in which detailed evaluation will be completed in late 2016.
Governmental Departments
Detailed evaluation has been conducted by the Ministry of Interior, and other departments plan to complete full M17+ compliant evaluation in 2017 and 2018. Using the available documents and evaluations summed up in Chapter 4, discussions in the implementation period will propose classifying ROs into tentative qualitative grades A', B', C', and D' if departments do not complete their full evaluations.
Universities
In the universities segment, evaluation modules are planned to be fully phased in by 2020. Tentative qualitative grades A', B', C', and D' will be proposed for the implementation period, taking account of the fact that scaling is carried out by gradual accumulation of annual national evaluations and the phasing in of other modules, rather than by full evaluation with all modules. The scaling during the implementation period, when only selected indicators in Modules 1 and 2 are used for national evaluation, is considered tentative and its primary purpose will be for ROs to know the quality of their selected results against the national standard. The tentative 2017 rating may change in 2019, as a result of the 2018 evaluation and, in particular, the international evaluation of full Modules 1 and 2. Discussion results are subject to approval by the RDI Council pursuant to section 35(2)(d) of Act No. 130/2002 Coll. Once approved, the results will be published along with the reasons.
4.4 Funding
The following are the input data for discussions on budgetary chapters:
• Annual report for the given chapter, prepared by the provider;
• Recapitulative Provider Level Reports prepared by the RDI Section;
• Previous qualitative category classification of each RO;
• R&D&I NP;
• Departmental policies, and National RIS3 Strategies, where appropriate.
The final decisions granting institutional aid to individual ROs are within the powers of the provider in accordance with Act No. 218/2000 Coll.
An increase in the number of the ROs which receive aid constitutes no entitlement for the relevant provider to receive more LCDRO-related aid for the provider's budgetary chapter. The initial volume of this type of institutional aid for a brand new RO is decided by the provider within the LCDRO expenditures approved for the chapter. Then the provider should conduct a full evaluation as soon as possible - for a brand new RO, after three to five years. If the institution exists and conducts R&D&I operations but has not yet been recognised as a RO, or is a RO but receives no LCDRO, the provider first conducts a full evaluation, and allocates LCDRO after the R&D&I expenditures for the given budgetary chapter have been discussed. Any increase in the number of ROs, and thus a possible increase in the institutional aid funds (LCDRO), is to be reviewed by the provider at the regular meetings discussing the proposed R&D&I SB expenditures, which the RDI Council submits to the government. M17+ regulates the evaluation of ROs rather than determines the volume of aid for LCDRO. The evaluation result is just one item in the input data relevant for the funding of the given RO. The funding decision is at the sole discretion of the provider.
5. EVALUATION IN THE UNIVERSITIES SEGMENT
All universities will be evaluated in accordance with the procedures applicable to the universities segment. The government has assigned the RDI Deputy Prime Minister to fine-tune the procedure, in collaboration with the Minister for Education, Youth and Sports, the Defence Minister, the Interior Minister, and the representation of universities, and, by 31 December 2018, to submit to the government the document fine-tuned to the degree of detail required for full evaluation in the universities segment and to prepare the process for its implementation.
5.1 Evaluation Procedure
1) Evaluation covers any university that:
a) by the date universities are invited to submit their aid application documents (the "documents"), has been registered, with the required particulars, in the List of Research Organisations administered by the MEYS pursuant to section 33a of Act No. 130/2002 Sb.;37
b) is within the scope of competence of the given provider in accordance with section 4(2)(a) of Act No. 130/2002 Sb.;
c) submits complete documents in due time; if defects are identified in the documents, the universities which rectify the defects at the provider's request will also be evaluated.
37 By 1 July 2017, when section 33a of Act No. 130/2002 Coll. on the support of research and development from public funds and on the amendment to some related acts (the Act on the Support of Research and Development) takes effect.
2) Universities will be invited to submit their self-evaluation reports following the structure of Modules 3-5, and other documents facilitating evaluation in all modules. An important indicator in the Strategy and Policies module is how a university conducts self-evaluation, whether it has established an international advisory body, and how it ensures human resources development. A faculty, a group of faculties, or an institute is the unit of evaluation for full evaluation in the universities segment.
3) The documents are evaluated by phases:
a) Completeness of application and data pursuant to section 14(3) of Act No.
218/2000 Sb.;
b) Self-evaluation report evaluation by "peer review" in collaboration with the MEYS and with the assistance of the expert advisory body - the composition of the expert advisory body must be published prior to evaluation;
c) If a university's self-evaluation report fails to be approved, the university revises the report and submits it for re-approval.
4) The evaluation under 3 may be accompanied by an on-site assessment of the faculty, group of faculties, or university institute being evaluated.
5) The next evaluation is conducted five years after the last evaluation. Any subsequent evaluation must also evaluate the progress the organisational unit has made since the previous evaluation. The results of annual evaluations will be used as one of the documents.
5.2 Assigning Qualitative Grades
• Universities are rated with the basic qualitative grades A, B, C, and D on the basis of the five-year full evaluations in 2020, 2025, 2030, etc. Result evaluation by M1 and M2 weighs significantly in the universities segment.
• Annual national evaluation monitors whether the RO performs to the qualitative grade achieved in the last full evaluation.
• Before the first full evaluation, i.e. in 2017-2019, tentative grades A', B', C', and D' are assigned using the national evaluation and the phasing in of other modules (see the Universities Time Schedule Chart).
5.3 Discussing Evaluation Results
A report is to be prepared to evidence the evaluation result for each university evaluated; the report must include the basic identification data for the underlying documents, the method and result of evaluation, and reasons. The evaluation's result and recommendations are discussed with the management of the evaluated university, plus the provider (for state universities). The parties concerned make their comments on whether the university meets the qualitative rating proposed. The evaluation conclusions are also supplied to the National Accreditation Authority for further use. The evaluation result may be appealed by the procedure described in Chapter 4.1.
6. EVALUATION IN SEGMENT OF GOVERNMENTAL DEPARTMENTS
M17+ is the first system to connect two hitherto independent lines of evaluation - RO result evaluation conducted by the RDI Council/RDI Section, and the evaluation of RO long-term development policy conducted by the provider pursuant to Act No. 218/2000 Sb., on budgetary rules and amendments to some related acts (the Budgetary Rules), as amended, as part of evaluating the underlying documents for the granting of aid.38 M17+ in Appendix 1, Chapter 1.2 Evaluation Principles, defines the minimum evaluation scope, conditions and criteria for specification in departmental methodologies, which the relevant provider, in accordance with the framework document M17+, specifies or supplements with criteria and procedures according to the focus of the ROs which receive institutional aid from the provider. M17+ in Appendix 1 is thus a joint methodology for providers of LCDRO institutional aid, which providers should fine-tune and detail to match their focus and needs while preserving the principles of evaluation. If the provider is different from the promoter of the RO, the provider always asks for the promoter's opinion on the evaluated RO. The promoter's representatives take part in the evaluation process.
6.1 Purpose of Evaluation
The purpose of the evaluation of ROs and their results is defined in Acts Nos.
130/2002 Coll. and 218/2000 Coll. The purpose is to provide institutional aid to LCDRO in accordance with valid legislation, and to obtain information for managing the system of R&D&I in the Czech Republic, along with the information necessary for providers to meet their roles and for RO management to manage their ROs over the long term. The provider evaluates all the ROs which:
a) Are registered in the public administration information system "List of Research Organisations" administered by the MEYS pursuant to section 33a of Act No. 130/2002 Coll. as at the date on which ROs are invited to submit their underlying documents for institutional LCDRO aid (the "documents") that show the required particulars;
b) Are within the scope of competence of the given provider in accordance with section 4(2)(a) of Act No. 130/2002 Sb.;
c) Submit complete documents in due time; if defects are identified in the documents, the ROs which rectify the defects at the provider's request within 14 calendar days will also be evaluated.
38 Section 3(3)(a) of Act No. 130/2002 Coll. on the support of research and development from public funds and on the amendment to some related acts (the Act on the Support of Research and Development).
6.2 Provider Five-year Evaluation Cycle and Relation to Annual Evaluation of RO Results
For the LCDRO assessment in 2017, the provider specifies the purpose of the subsidy, and monitors the progress in the accomplishment of the purpose while using the RO results evaluation conducted each year by the RDI Council/RDI Section. In 2022, the provider will first assess the implementation of the previous long-term RO development policy for 2018-2022, and use these evaluation results to evaluate the LCDRO for the next five years. The provider will publish the results of the initial and the final evaluations. The time schedule for RO evaluation in the departmental segment shows that the first evaluation cycle will be completed by 2018. For this purpose a detailed departmental evaluation methodology has been prepared and is attached to this document as Appendix 1.
6.3 Evaluation Underlying Documents
This part describes the underlying documents for an evaluation according to criteria in accordance with the M17+ modules, which must be observed in the specification by the relevant providers.
1. The RO must exist as a legal entity for a minimum period of five years;39 if a RO is merged, consolidated or divided, the duration of the original RO is included in the five-year requirement for the RO's legal successor.
2. The required institutional LCDRO aid must be in compliance with the European legislation regulating ROs as beneficiaries of state aid, in particular clauses 17-23 of article 2.1 of the Framework for State Aid for Research and Development and Innovation (2014/C 198/01).
3. The given RO proves the purpose of the subsidy through the LCDRO, so it contains mainly the data necessary for the assessment of the subsidy (and is primarily targeted at the future, unlike the evaluations of ROs' results conducted by the RDI Council/RDI Section for the past five years).
39 Where appropriate, the provider may reduce this time.
7. EVALUATION OF CAS
7.1 Current Situation
One of the most important tasks for the managements of CAS and its research institutes is permanent emphasis on achieving qualitative improvements in research and expert activities, making research institutes participate in international research, and duly performing other CAS roles defined in legislation.
In order to ascertain how well this task is performed, the management of CAS has been organising regular evaluations of CAS' research institutes since its establishment in 1993. These evaluations are also used for the differentiated institutional funding of CAS research institutes. The implementation of the evaluation of the research and expert activities of CAS research institutes for 2010-2014 was decided by the Academy Council of CAS on 6 October 2014 following broad discussion, including discussions in the Academy Council of CAS. The comprehensive evaluation of the research and expert activities of CAS' research institutes for 2010-2014 was conducted in 2015. The results became available in 2016. The usability of these results for managing the R&D&I system at the national level is being discussed by the RDI Council/RDI Section.
7.2 Objectives, Principles and Content of Evaluation
The Academy Council of CAS has defined three main objectives for this evaluation:
1. Obtain qualitative and quantitative information on the situation of research at CAS in the period 2010-2014 in the national, European and global contexts.
2. Obtain information for the strategic management of CAS as a whole, including the funding of research institutes as a partial aspect of management.
3. Obtain independent and comparable evaluation and feedback for managing the research institutes and teams of CAS.
The requirements for evaluation are defined in a currently applicable document titled "Basic Principles for the Evaluation of Research and Expert Activities of CAS Research Institutes for 2010-2014", which was approved by the Academy Council of CAS.40 Evaluation conducted in accordance with these principles fully complies with the minimum standards of evaluation in all modules required for management level II evaluation. Elaborated to the degree of detail required for management level III evaluation, it also provides structured information up to the level of the teams of each RO.
40 See the complete documents: Hodnocení - Akademie věd České republiky [online]. © Středisko společných činností AV ČR, v. v. i. [cit. 4 January 2017]. Available from: http://www.avcr.cz/cs/o-nas/hodnoceni/
7.3 Future Development
Discussions are conducted about establishing closer ties between the RDI Council/RDI Section evaluation and the CAS evaluation, sharing information, collaborating in data processing (the question of using the ASEP information system for evaluations under M17+), determining the degree of an appropriate use of evaluation conclusions, or allowing active participation in the evaluation processes. Throughout the validity of Act No. 130/2002 Coll. on public funding of research, experimental development and innovations, and amendments to some related acts (the Research, Experimental Development and Innovation Aid Act), as amended, which requires that the results of all ROs be evaluated annually, it is also required that CAS takes part in national R&D&I evaluation as specified in this document. CAS plans to hold the next round of comprehensive evaluation in 2020, when comprehensive evaluation of universities by all the modules proposed will also have been completed.
8. EVALUATION OF PURPOSE-TIED AID PROGRAMMES
Programme evaluation is a separate evaluation discipline.
Programme evaluation must, in each programme phase (prior to announcement, after termination, and during the programme where applicable), exactly correspond to the focus of the given programme. Adopting this view, the government approved the Basic Principles for Preparing and Evaluating Programmes and Groups of Research, Development and Innovations Grant Projects in Government Resolution No. 351 of 13 May 2015. Exact conditions for the evaluation of each programme (time schedule, ways and methods of evaluation), including appropriate indicators allowing the degree of accomplishment of the objectives to be determined, must be specified as part of each new purpose-tied programme, subject to approval, and must be defined with regard to the aforesaid Basic Principles in the context of the given programme. Purpose-tied programmes will be evaluated as follows:
• The programmes submitted to the government for approval in 2020 and later will be prepared and evaluated by the principles approved under Government Resolution No. 351, Part I, of 13 May 2015.
• Existing programmes will be evaluated by the RDI Council in accordance with section 35(2)(d) of Act No. 130/2002 Sb., with an appropriate application of the basic principles for preparing and evaluating programmes and groups of research, development and innovations grant projects approved under Government Resolution No. 351 of 13 May 2015.41 Providers must cooperate with the RDI Council for programme evaluation.42
• According to a task resulting from Part II of Government Resolution No. 351 of 13 May 2015 concerning the basic principles for preparing and evaluating programmes and groups of research, development and innovations grant projects, these principles will be reflected in the Research, Experimental Development and Innovation Aid Act through a change to the compulsory content of the proposal for purpose-tied aid programmes. A detailed elaboration of the principles, in the form of a specific evaluation proposal, will be part of each new programme submitted to the government for approval in 2020 and the following years.
41 Appropriate application means using the Principles to the maximum extent possible while respecting the limitations due to the fact that the programmes had been prepared and approved by the government before the Principles were defined. In most cases, the programmes lack objective performance indicators and their initial values, an evaluation method and time schedule, parameters for monitoring beyond the statutory duty in the ISREDI, etc.
42 Cooperation means providing information relevant for the given programme beyond the statutory duty to inform in the ISREDI if the provider has such information available or can obtain it more swiftly and effectively than the RDI Council.
Abbreviations
CAS - the Czech Academy of Sciences
LCDRO - Long-term Conceptual Development of Research Organisations
ISREDI - Information System for Research, Experimental Development and Innovations
M1-5 - Modules 1 to 5
M13-16 - Methodology for evaluating the results of research organisations and the results of completed programmes, valid for 2013-2016
M17+ - Methodology for evaluating research organisations and research, development and innovations purpose-tied support programmes
MEYS - Ministry of Education, Youth and Sports
R&D&I NP - National Research, Development and Innovation Policy for 2016-2020
RDI Council - Research, Development and Innovation Council
RDI Section - Research, Development and Innovation Section of the Office of the Government of the Czech Republic
R&D&I SB - State Budget for Research, Development and Innovation
SSHA - Social Sciences, Humanities and Arts
R&D&I - System of Research, Development and Innovation
RO - Research Organisation
APPENDIX 1: METHODOLOGY OF EVALUATING RESEARCH ORGANISATIONS IN GOVERNMENTAL DEPARTMENTS
Introduction
The methodology of evaluating departmental research organisations according to this appendix is a regulation determining the minimum evaluation extent, conditions and criteria to be specified by providers (or extended to match the focus of the ROs receiving institutional aid from the provider). M17+ is thus a joint methodology for providers of LCDRO institutional aid, which providers should fine-tune and detail to match their focus and needs while preserving the principles of evaluation. M17+ uses all the experience from the existing institutional aid evaluations since 1999 (from research project evaluations to the 2013-2016 Methodology), suggestions from the IPn Methodology, results from the 2015 evaluation of CAS research institutes, international experience, and other sources. M17+ is also the first methodology to connect two hitherto separate lines of evaluation - the evaluation of ROs' results conducted by the RDI Council/RDI Section of the Office of the Government of the Czech Republic, and the evaluation of long-term RO development policies conducted by providers as part of assessing the documents for granting aid. If the provider is different from the promoter of the RO, the provider always asks for the promoter's opinion on the evaluated RO. Promoters' representatives take part in the evaluation process.
1. EVALUATION IN SEGMENT OF GOVERNMENTAL DEPARTMENTS - GENERAL CONSIDERATIONS
1.1 Purpose of Evaluation
The purpose of the evaluation of ROs and their results is defined in Acts Nos. 130/2002 Coll. and 218/2000 Coll. The purpose is to provide institutional aid to LCDRO in accordance with valid legislation, and to obtain information for managing the system of R&D&I in the Czech Republic, information necessary for providers to meet their roles, and for RO management to manage their ROs over the long term.
1.2 Evaluation Principles
This part sums up the basic evaluation principles, which providers must observe in the specification of M17+.
1. The provider evaluates the RO, and the RO's results are evaluated by the RDI Council/RDI Section of the Office of the Government of the Czech Republic
The provider of LCDRO institutional aid evaluates the RO's application for subsidy and the documents for the granting of aid pursuant to section 14(3) of Act No.
218/2000 Sb., particularly the purpose of the subsidy described in the LCDRO, through peer review by the expert advisory body (bodies). If the provider is different from the RO's promoter, the promoter's representatives are involved in the provider's expert advisory bodies. The RDI Council/RDI Section arranges for, in particular, the annual evaluation of ROs' results using the ISREDI, pursuant to section 35(2)(d) and (h) of Act No. 130/2002 Coll.
2. The amount of LCDRO aid for providers is defined by the R&D&I SB for the given year
The amounts of LCDRO aid for the respective budgetary chapters result primarily from the discussions on the draft state budget for research, development and innovation based on the valid R&D&I NP,43 a mid-term budgetary outlook, etc., rather than just the evaluation of ROs' results conducted by the RDI Council/RDI Section. Under section 5a(2)(b) of Act No. 130/2002 Sb., the evaluation of ROs' results is one of the input data for the preparation of the first draft R&D&I SB. Given the different focus and purpose of the ROs under the purview of each provider, the evaluation cannot be directly reflected in the draft R&D&I SB. For discussing the matter with providers, the RDI Council/RDI Section will prepare an annual evaluation and a clear analytical document for the given sphere of research, prepared for monitoring long-term RDI development in the given sphere, in applied research in particular.
43 Office of the Government of the Czech Republic. National Research, Development and Innovation Policy of the Czech Republic, 2016-2020, approved under Government Resolution No. 135 of 17 February 2016. ISBN: 978-80-7U0-U3-5.
3. Provider five-year evaluation cycle
For the LCDRO assessment in 2017, the provider specifies the purpose of the subsidy, and monitors the progress in accomplishing the purpose, while using the RO results evaluation conducted each year by the RDI Council/RDI Section. In 2022, the provider will first assess the implementation of the previous LCDRO in 2018-2022, and use these evaluation results to evaluate the LCDRO for the next five years. The provider will publish the results of the initial and the final evaluations.
4. Initial aid situation and changes
Provided that the given RO is put on the List of Research Organisations pursuant to section 33a of Act No. 130/2002 Coll. and is evaluated under this M17+ as specified by the provider, the provider is allowed to modify the initial aid by no more than -5/+10 per cent each year. Any increase in the number of the ROs to receive aid constitutes no entitlement for an increase of LCDRO aid for the provider's budgetary chapter.
5. Procedure in terminating purpose-tied aid for activities provided by some ROs
If a government resolution terminates purpose-tied aid for activities provided by some ROs, i.e. the purpose-tied funds are carried over into the institutional funds, such as the aid for National Sustainability Programmes I and II under Government Resolution No. 1067 of 21 December 2015, the funds will be carried over to the specific ROs by increasing their expenditures.
6. The results of provider evaluations are primarily to specify the focus of development for the given RO
If any part of the LCDRO fails to be approved in the provider's evaluation, the RO must revise the LCDRO taking account of the objections, and the provider must evaluate the revised LCDRO. The provider may appropriately reduce the aid only if the revised LCDRO fails to be approved.
7. Procedure where evaluation is not conducted
If, for any reason, the RDI Council/RDI Section or the provider does not conduct the evaluation, the results of the last evaluation apply.
8. Connections between provider evaluation and the preparation of the draft R&D&I SB
In order to prepare the draft R&D&I SB, the RDI Council/RDI Section has the right to request from the provider the complete underlying documents for the provider-level evaluation.
2. EVALUATION MILESTONES
The following chapters chronologically describe the tentative deadlines for the stages of evaluation and LCDRO funding (after approval of M17+ and providers' methodologies).
2.1 RO Evaluation by LCDRO Conducted by Provider
(No. / Stage or Activity / Responsible / Deadline)
Input Evaluation of Strategies for 2018-2022 Conducted in 2017
1. RO invited to submit documents for granting aid containing all the required information and specifying the maximum amount of aid - Provider - 31 May 2017
2. Submit documents for granting aid containing all the required information - RO - 31 August 2017
3. Check whether the subsidy application and data are complete and the aid criteria fulfilled (evaluation phases 1 and 2) - Provider - 30 September 2017
4. Assess LCDRO by peer review evaluation by expert advisory body (evaluation phase 3) - Provider - 15 December 2017
5. Pass the Czech Republic state budget bill and pass LCDRO expenses for the following year - (Chamber of Deputies) - December
6. Issue and publish the decision to grant institutional aid for the years 2018-2022 - Provider - 31 January 2018
Continuous Evaluation for 2018-2021 Conducted in 2019-2022
7. LCDRO implementation and use of aid progress report - RO - 5 January*
8. Assess the LCDRO implementation progress report by expert advisory body, and the use of the aid in the past year - Provider - 28 January
9. Issue and publish the amended decision to grant institutional aid for the given year - Provider - 31 January
Final Evaluation for 2018-2022 Conducted in 2023
10. Final LCDRO implementation report, including progress in the objectives defined, the results realised, and the use of aid for the whole project period - RO - 5 January 2023
11. Evaluate the final LCDRO implementation report by expert advisory body, and the use of aid in 2018-2022 (to be evaluated along with the proposal for 2023-2027) - Provider - 30 April 2023
12. Publish final evaluation - Provider - 30 June 2023
Notes: * The progress report for 2018 by 5 January 2019, and so forth.
2.2 Procedure in RO Evaluation by LCDRO Conducted by Provider
1. The provider evaluates all the ROs which:
a) by the date ROs are invited to submit their application documents for LCDRO institutional aid (the "documents"), have been registered, with the required particulars, in the public administration information system titled "List of Research Organisations" administered by the MEYS pursuant to section 33a of Act No. 130/2002 Sb.;44
b) are within the scope of competence of the provider in accordance with section 4(2)(a) of Act No. 130/2002 Sb.;
c) submit complete documents in due time; if defects are identified in the documents, the ROs which rectify the defects at the provider's request within 14 calendar days will also be evaluated.
44 By 1 July 2017, when section 33a of Act No. 130/2002 Coll. on the support of research and development from public funds and on the amendment to some related acts (the Act on the Support of Research and Development) takes effect.
2. The provider invites the ROs under the provider's purview by 31 May 2017 to submit the LCDRO institutional aid application documents for 2018-2022 showing all required information, including the maximum subsidy amount for each RO in each year; the provider takes account of the subsidy amounts granted in 2016, which the provider may each year modify by -5/+10 per cent according to his evaluation (this limit does not include any increase/decrease in the provider's LCDRO expenditure in the R&D&I SB).
3. The ROs which received no aid in 2016 may only be granted aid if they meet all the requirements of the documents, have been positively evaluated by the provider,45 and the provider has funds available in his chapter for the relevant period to grant aid to such ROs (any increase in the number of ROs receiving aid does not constitute any entitlement to a higher LCDRO aid for the provider's budgetary chapter). In 2017, the provider evaluates these ROs along with other ROs, and when the documents are submitted in the following years in 2018-2022, the provider will make a separate evaluation similar to that conducted in 2017.
45 That is, the provider's evaluation considers the given RO as qualified for being assigned LCDRO.
4. In 2017, the provider evaluates the documents progressively, evaluating only those which have passed the previous phase; the phases are the following:
a) Completeness of subsidy application and data pursuant to section 14(3) of Act No. 218/2000 Sb.;
b) Satisfaction of the criteria for the granting of LCDRO institutional aid - the criteria are: i) the RO must exist as a legal entity for a minimum period of five years;46 ii) the LCDRO institutional aid required must be in compliance with European legislation;
c) Peer review LCDRO evaluation by expert advisory bodies, the number, structure and decision-making procedure of which the provider specifies in sufficient detail. The composition of the expert advisory body must be published prior to evaluation.
d) If any part of the LCDRO fails to be approved in the evaluation, the RO must revise the LCDRO, and the provider must evaluate the revised LCDRO. The deadlines for submitting the revised strategies and having them evaluated will be determined by the provider appropriately to the extent of revision. Only if the revised strategies fail to be approved may the provider reduce the aid in a manner corresponding to the non-approved expenditures and distribute the funds among other ROs under the provider's purview.
46 Where appropriate, the provider may reduce this time.
5. The provider prepares a report to evidence each evaluation phase in 2017; the report must include the basic identification data for the underlying documents, the method and result of evaluation, and reasons. The RO will receive the report with the evaluators' personal data deleted.
6. Using the evaluations conducted in 2017, the provider issues a decision granting LCDRO institutional aid for 2018-2022 and publishes the decision on the provider's website.
7. By the deadline determined by the provider, the RO is to submit each year a LCDRO implementation and use of aid progress report for the previous year, and the provider evaluates this report via the expert advisory body. Using this progress report, the provider issues an amended decision for the given year, in which the provider may change the amount of aid for each RO by -5/+10 per cent of the expenditures (this limit does not include any increase in the provider's LCDRO expenditures).
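The -5/+10 per cent band used throughout this procedure can be expressed as a simple clamp; the following is a minimal sketch, assuming the band is applied to the aid granted in the previous year and ignoring changes in the provider's overall LCDRO expenditures, which the methodology explicitly excludes from the limit.

```python
def clamp_aid_change(previous_aid: float, proposed_aid: float) -> float:
    """Keep a year-on-year modification of a RO's institutional aid within the
    -5/+10 per cent band (changes in the provider's overall LCDRO expenditures
    fall outside this limit and are not modelled here)."""
    lower = previous_aid * 0.95
    upper = previous_aid * 1.10
    return max(lower, min(proposed_aid, upper))

# e.g. clamp_aid_change(10_000_000, 12_000_000) is capped at roughly 11_000_000 (+10%)
```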
3. INPUT EVALUATION FOR 2018-2022 CONDUCTED IN 2017

3.1 Request for RO to Submit Documents with Defined Particulars and Maximum Amount of Aid

The provider invites the ROs within its purview by 31 May 2017 to submit their documents for 2018-2022 that contain:

a) Subsidy application pursuant to section 14(3) of Act No. 218/2000 Coll. that contains the following required information:
1. Name, seat and identification number of the RO as a legal entity;
2. Name and address of the provider;
3. Required amount of aid for each year, which must not be higher than the maximum aid determined by the provider according to the subsidy amount granted to the RO in 2016; the provider may each year modify this amount by -5/+10 per cent according to his evaluation (this limit does not include any increase/decrease in the provider's LCDRO expenditure in the R&D&I SB);
4. Purpose of subsidy - by reference to the RO's LCDRO;
5. Date by which the purpose of the subsidy should be achieved, i.e. in 2018-2022;
6. For the RO as a legal entity, the identification data for i) Persons acting on behalf of the RO, and whether they act in the position of authorised governing body or under power of attorney; ii) Persons holding a stake in this legal entity; iii) Persons in which the RO holds a stake, and the amount thereof;

b) Documents for the evaluation by the criteria for the granting of LCDRO institutional aid - the criteria are:
1. The RO must exist as a legal entity for a minimum period of five years;47 if a RO is merged, consolidated or divided, the duration of the original RO is included in the five-year requirement for the RO's legal successors;
2. The required institutional LCDRO aid must be in compliance with the European legislation regulating ROs as beneficiaries of state aid, in particular clauses 17-23 of article 2.1 of the Framework for State aid for research and development and innovation (2014/C 198/01);

c) Long-term RO development policy for 2018-2022. The RO demonstrates the purpose of the subsidy by submitting its LCDRO, which therefore contains mainly the data necessary for assessing the subsidy (and is primarily targeted at the future, unlike the evaluations of ROs' results conducted by the RDI Council/RDI Section in the past five years). The following is the common minimum information for the LCDRO of all providers, which the relevant provider is to specify for the ROs within the provider's competence:
1. Basic identification data (document name, RO name, period);48
2. Comprehensive section: i) Historic and current details of the RO - basic information; ii) Overall policy goal for the whole RO and the goal's ties to the provider's policy;49 iii) Total institutional LCDRO funds required by the RO and broken down by year and eligible cost (pursuant to section 2(2)(k) of Act No. 130/2002 Coll.);
iv) Other resources for RO research development (purpose-tied aid, funds from ESIF and other structural funds, international funds, proceeds from contracted research, etc.); v) RO's international and national collaboration, collaboration with the users of research results; vi) RO's other specific research activities and the activities related thereto (training, expert activities, etc.).
3. Fields of expertise researched by the research teams of the RO, structured into: i) Field of research; ii) Sub-goal of the policy for 2018-2022 for the field of research, and the controllable objectives for each year; iii) Composition of the team conducting the research (names, job titles and workloads of the RO's employees or students); iv) Major results in the given field of research realised in the previous five years; v) Expected results in the field of research and the period of their application in 2018-2022.

47 Where appropriate, the provider may reduce this time.
48 For instance, Long-term Development Policy for Research Organisation, Research Institute 2018-2022.
49 According to the currently valid research and development policy for institutional aid providers, or other policies specified by the provider: Ministry of Culture - Interdepartmental policy for applied research and national and cultural identity development for 2016-2022 (Government Resolution No. 886 of 27 November 2013); Ministry of Health - Healthcare Research Policy up to 2022 (Government Resolution No. 58 of 22 January 2014); Ministry of Agriculture - Ministry of Agriculture's Research, Development and Innovation Policy for 2016-2022 (Government Resolution No. 82 of 3 February 2016); Ministry of Defence - Defence Applied Research, Development and Innovation Policy for 2016-2022 (Government Resolution No. 246 of 21 March 2016); Ministry of Interior - Interdepartmental Security Research Policy 2009-2015, prolonged up to 01 2017 by National Security Council Resolution No. 32/2015 on the Development of Security Research Aid System after 2015; CAS - has its own system of evaluation, its policy "CAS Strategy 21" was approved at Session XLV of the CAS Academy Assembly on 16 December 2014; Ministry of Education, Youth and Sports, and Ministry of Industry and Trade have no separate government-approved R&D&I policies, i.e. the ministries specify other departmental policy-related documents approved by the government. In 2016, the RDI Council approved four R&D&I policies of new institutional aid providers: Ministry of the Environment - Ministry of the Environment's Research and Development Policy for 2016-2025; Ministry of Transport - Transport Research, Development and Innovation Policy up to 2030; Ministry of Labour and Social Affairs - Research and Development Policy of the Ministry of Labour and Social Affairs; Ministry of Foreign Affairs - Research and Development Policy of the Ministry of Foreign Affairs for 2016-2025.

3.2 Submitting Documents with All Data Required

The RO must submit to the provider its documents meeting all the requirements defined by this methodology by 31 August 2017, in the manner defined by the provider in the submission request.

3.3 Provider Evaluating Completeness of the Subsidy Application and Data (Evaluation Phase 1)

a) By 30 September 2017, the provider checks the completeness of the documents specified in this methodology.
b) If defects are identified in the subsidy application, the ROs which rectify the defects at the provider's request within 14 calendar days will also be evaluated.
3.4 Provider Evaluating the Meeting of the Criteria for the Granting of Subsidy (Evaluation Phase 2)

The meeting of the following criteria for the granting of subsidy must be evaluated by the provider by 30 September 2017:
a) The RO must exist as a legal entity for a minimum period of five years;50 if a RO is merged, consolidated or divided, the duration of the original RO is included in the five-year requirement for the RO's legal successors;
b) The required institutional LCDRO aid must be in compliance with the European legislation regulating research organisations as beneficiaries of state aid, in particular clauses 17-23 of article 2.1 of the Framework for State aid for research and development and innovation (2014/C 198/01).

50 Where appropriate, the provider may reduce this time.

3.5 Provider Evaluating LCDRO by Peer Review Evaluation through Expert Advisory Body (Evaluation Phase 3)

a) The LCDRO must be evaluated by the provider by 15 December 2017, by peer review evaluation through expert advisory bodies, the number, structure and evaluation process of which the provider modifies to meet its needs. The composition of the expert advisory body must be published no later than upon completion of the evaluation.
b) The main RO evaluation criteria are these:51
1. Research environment (the standard of the RO's policies and how the provider's policies are implemented, the conditions and prerequisites for research, etc.);
2. International and national collaboration (the RO's collaboration with other research organisations);
3. Research excellence (the evaluation of selected results of the RO, other specific research activities of the RO);
4. Research performance (collaboration with the users of R&D&I results, resources obtained outside LCDRO, effective use of the funds requested, etc.);
5. Social relevance and impacts of research.
In order to take account of its specifics, the provider may add other criteria, for example by using IPn Methodology52 outputs or other sources.
c) If any part of the LCDRO fails to be approved in the evaluation, the RO must revise the LCDRO, and the provider must evaluate the revised LCDRO. The deadlines for submitting the revised strategies and having them evaluated will be determined by the provider appropriately to the requested extent of revision. Only if the revised LCDRO fails to be approved may the provider reduce the aid in a manner appropriate to what is not approved.
d) The provider prepares a report to evidence each evaluation phase in 2017; the report must include the basic identification data for the underlying documents, the method and result of evaluation, and concrete reasons. The RO will receive the report with evaluators' personal data deleted.

51 The main RO evaluation criteria must be based on the module specifications according to M17+. The RDI Council checks whether providers' methodologies are in compliance with the framework defined by M17+.
52 See Research and Development Evaluation Methodology and Funding Principles (Comprehensive Report), Part 3.2.5 Overview of Evaluation Criteria, pp. 53-58, incl. Fig. 15 "List of Main Indicators and Their Relevance to Types of RO". See: MSMT / IPN METODIKA [online]. Available from: http://metodika.reformy-msmt.cz/
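Sections 3.3-3.5 describe a strictly sequential assessment: a document set only reaches phase 2 if phase 1 succeeds, and only reaches the peer review in phase 3 if phase 2 succeeds. The sketch below is a minimal illustration of that gating logic under assumed function and field names; the actual checks are those defined in sections 3.3-3.5, not the placeholders used here.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class PhaseResult:
    phase: int
    passed: bool


def evaluate_documents(checks: List[Callable[[], bool]]) -> List[PhaseResult]:
    """Run the evaluation phases in order and stop at the first failure,
    mirroring the progressive evaluation in sections 3.3-3.5 (phase 1:
    completeness, phase 2: granting criteria, phase 3: peer review).
    The callables are placeholders for the provider's real checks."""
    results = []
    for number, check in enumerate(checks, start=1):
        passed = check()
        results.append(PhaseResult(number, passed))
        if not passed:
            break  # later phases are not evaluated
    return results


# Hypothetical usage: phases 1 and 2 pass, phase 3 (peer review) fails,
# so the RO would be asked to revise its LCDRO before re-evaluation.
print(evaluate_documents([lambda: True, lambda: True, lambda: False]))
```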
3.6 Issue and Publish Decision to Grant LCDRO Institutional Aid

Using the evaluations conducted in 2017, the provider issues a decision granting LCDRO institutional aid for 2018-2022 by 31 January 2018, and publishes the decision on the provider's website.

4. CONTINUOUS EVALUATION FOR 2018-2021 CONDUCTED IN 2019-2022

4.1 Progress Report

a) By the deadline determined by the provider, the RO is to submit each year a report on the progress in LCDRO implementation and the use of aid in the previous year.
b) The progress report includes the following, in particular:
1. Changes, if any, proposed to the LCDRO, in the structure specified in this methodology, and the reasons for the changes;53
2. Performance in the monitored objectives in the given year, as specified in this methodology;54
3. Achievement of the expected results, if any were planned for the given year according to this methodology.55
c) The provider evaluates the progress report through the expert advisory body.
d) Using this progress report, the provider issues an amended decision for the given year, in which the provider may change the amount of aid for each RO by -5/+10 per cent of the expenditures (this limit does not include any increase in the provider's LCDRO expenditures).

4.2 Changes during the Year

a) If any change occurs during the year that the RO could not have foreseen and that affects the purpose or the amount of the subsidy, the RO must request an amended, that is, a new decision, give reasons for the change, and provide all the documents pursuant to this methodology which are affected by the requested change.56
b) Evaluating the proposed change, the provider applies a procedure analogous to that applied to evaluating the progress report.

53 See 3.1(c) of Appendix 1 to M17+.
54 See 3.1(c)(3.ii) of Appendix 1 to M17+.
55 See 3.1(c)(3.v) of Appendix 1 to M17+.
56 See 3.1(c) of Appendix 1 to M17+.

5. FINAL EVALUATION FOR 2018-2022 CONDUCTED IN 2023

5.1 Final Report

a) By the date determined by the provider, the RO submits its final report on LCDRO implementation for 2018-2022, including the use of aid in 2022.
b) The final report must include:
1. Basic identification data (document name, RO name, period);57
2. Comprehensive section: i) Assessing the implementation of the LCDRO overall goal for the whole RO, and the goal's ties to the provider's policy;58
ii) Total institutional LCDRO funds spent by the RO and broken down by year and eligible cost (pursuant to section 2(2)(k) of Act No. 130/2002 Sb.); iii) Other resources for RO research development in the past five years (purpose-tied aid, international funds, proceeds from contracted research, etc.) and their comparison with the prerequisites, plus reasons for changes; iv) RO's international and national collaboration effected, collaboration with the users of research results; v) RO's other specific research activities and the activities related thereto (training, expert activities, etc.).
3. Fields of expertise researched by the research teams of the RO, structured into: i) Field of research; ii) Implementation of the sub-goals of the policy for 2018-2022 for the field of research and the controllable objectives for each year; iii) Composition of the team implementing the sub-goal (the name of the RO employee, or the student, and their workload) and changes to the team in the past five years; iv) Results realised in 2018-2022 and their comparison to the expected results.

5.2 Final Evaluation

a) The provider ensures final evaluation of LCDRO implementation in 2018-2022 through peer review via the expert advisory body or bodies by 30 April 2023.
b) The final report on LCDRO implementation in 2022 must be published by the provider by 30 June 2023.

57 For instance, Final Report on the Implementation of the Long-term Development Policy for Research Organisation, Research Institute 2018-2022.
58 According to the currently valid research and development policy for institutional aid providers, or other government-approved policies and strategies of the provider which the provider specifies: Ministry of Culture - Interdepartmental Policy for Applied Research and National and Cultural Identity Development for 2016-2022 (Government Resolution No. 886 of 27 November 2013); Ministry of Health - Healthcare Research Policy up to 2022 (Government Resolution No. 58 of 22 January 2014); Ministry of Agriculture - Ministry of Agriculture's Research, Development and Innovation Policy for 2016-2022 (Government Resolution No. 82 of 3 February 2016); Ministry of Defence - Defence Applied Research, Development and Innovation Policy for 2016-2022 (Government Resolution No. 246 of 21 March 2016); CAS - has its own system of evaluation, its policy "CAS Strategy 21" was approved at Session XLV of the CAS Academy Assembly on 16 December 2014; Ministry of Education, Youth and Sports, Ministry of Industry and Trade, and Ministry of Interior have no separate government-approved R&D&I policies, i.e. the ministries specify other departmental policy-related documents approved by the government. In 2016, the RDI Council approved four R&D&I policies of new institutional aid providers: Ministry of the Environment - Ministry of the Environment's Research and Development Policy for 2016-2025; Ministry of Transport - Transport Research, Development and Innovation Policy up to 2030; Ministry of Labour and Social Affairs - Research and Development Policy of the Ministry of Labour and Social Affairs; Ministry of Foreign Affairs - Research and Development Policy of the Ministry of Foreign Affairs for 2016-2025.

APPENDIX 2: CONFLICT OF INTERESTS AS TREATED BY CAS AND IPN METHODOLOGY

1. Conflict of Interests as Treated in CAS Evaluation

Principles: Step 1 is to arrange the heads and members of panels. A panel will be composed of foreign researchers. The number of persons on a panel will differ according to the size and heterogeneity of the subject of research. The persons must be internationally recognised authorities with no conflict of interest vis-a-vis any CAS research institute in the given field of research. The list of nominated panel heads and members will be subject to approval by the CAS Academy Council and, once approved, panel heads and members will be appointed by the CAS president and make contracts with CAS. All panel members, including heads, and evaluators must confirm no conflict of interests in OIS prior to evaluation.59 Research institutes will be able to object to the persons appointed (a letter has been sent to research institutes requesting that they identify any unsuitable evaluator and give brief reasons).

59 Online information system.
Conflict of interest: Definition of the conflict of interests for Reviewers - Head of Panel, Commission Chair and Deputy Chair, Panel Member, Commission Member and Evaluator. For the Research Evaluation Exercise 2015, held by the Czech Academy of Sciences, a conflict of interests exists if a Reviewer:
a) Was involved in the preparation of, or is a co-author of, the outputs and/or results to be evaluated (applies to Evaluators only);
b) Has close family ties (spouse, domestic or non-domestic partner, child, sibling, parent, etc.) or other close personal relationship with any person who is a co-author of the outputs and/or results to be evaluated and who is from the assessment unit to be evaluated, or with the head of the assessment unit to be evaluated, or with any person representing the legal entity to be evaluated;
c) Is in any way involved in the management of any legal entity to be evaluated;
d) Is employed or contracted by any legal entity to be evaluated;
e) Has or has had a relationship of scientific rivalry or professional hostility with any co-author of the outputs and/or results to be evaluated, or with the head of the assessment unit to be evaluated;
f) Has or has had in the past a mentor/mentee relationship with any co-author of the outputs and/or results to be evaluated who is from the assessment unit to be evaluated, or with any person from the legal entity or assessment unit to be evaluated.

The Coordination Board, upon notification from the Reviewer, will decide whether a conflict of interest exists if any other situation (e.g. joint projects) appears that could cast doubt on the Reviewer's ability to participate in the evaluation impartially, or that could reasonably appear to do so in the eyes of an external third party. If it is revealed during an evaluation that a Reviewer has knowingly concealed a conflict of interest, the Reviewer will be immediately excluded. Any panel decision in which he or she has participated will be declared null, and the output(s) and/or result(s) concerned will be re-evaluated.

2. Conflict of Interests as Treated in IPn Methodology

Concrete Matters in the Operation of Panels

The following are the matters important to the operation of panels (main panels and subject-related panels):

Conflicts of interests. All heads, members, evaluators, secretaries and expert advisers of main and subject-related panels must observe the measures to manage any conflict of interest. These persons must record their no bias statement and avoid any conflict of interest.

Confidential information measures. All heads, members, evaluators, secretaries and technical advisers of main and expert panels are bound by the terms and conditions of their confidentiality agreements. These agreements must ensure effective control and operation of the evaluation process.

Conflicts of Interests

The no bias statement is to be prepared by the steering team upon consulting with the Evaluation Management Council and the heads of the main panels. The no bias statement should address at least the issues described below. The no bias statement is to protect researchers and the rights of the RO, EvUn, RU60, panel members and any other persons involved in the evaluation process. All the persons taking part in the evaluation process must fill in and sign the no bias statement. These persons include all the members of the Evaluation Steering Team, the Steering Team, the secretariat of the panel, the secretariats of the main and the expert panels, evaluators, and expert advisers.

60 EvUn - evaluated unit, RU - research unit.
All members of main panels and subject-related panels must specify all their close personal or professional relations to the RO, EvUn and RU in the discipline or sub-discipline in which they contribute to the evaluation process, for example a planned, recently terminated or honorary office in the RO, more than three joint publications with researchers from a single RO, or collaboration in a field of applied research and commercialisation. No bias statements will be analysed and discussed by the Evaluation Steering Team. The Steering Team proposes rectification to the heads of the main and the expert panels.

Conflict of interests exists if an expert:
a) Can directly or indirectly profit from the evaluation;
b) Has close family or personal relations with any person employed by the organisation under evaluation;
c) Was employed or contracted by the organisation under evaluation;
d) Has taken part in research collaboration with the organisation under evaluation in the last five years;
e) Has been a mentor to, or been mentored by, employees of the organisation under evaluation.

Synthesis of the Bias Treatment Principles

Putting the two aforesaid proposals together, observance of the no bias rules can be ensured through the following rules:
1. Each evaluator must confirm their consent to the no bias rules in relation to the evaluated result or output, its author or originator, or the institution which has submitted the output or result for evaluation. Any biased panel member must not take part in evaluating the particular result.
2. Before accepting a result for evaluation, the evaluator must confirm he or she is not biased.
3. If a situation may raise doubts about the no bias status of an evaluator, or the evaluator may appear biased to a third party, the head of the panel decides, upon notice from the evaluator, whether or not the evaluator is in a conflict of interests.
4. If, during evaluation, it is established that the evaluator has violated the no bias rules, the outputs or results evaluated by that evaluator will be re-evaluated. Any decision made by the panel(s) in which that evaluator took part will be declared null and void.

Conflict of interests exists, without limitation, where the evaluator:
1. Was involved in the preparation of, or is a co-author of, the outputs or results (applicable to evaluators only) he or she is to evaluate;
2. Has close family ties (wife, or a partner whether or not living in the same household) or other close personal ties to (i) any person who is a co-author of the outputs or results to be evaluated and is a member of the unit to be evaluated, or (ii) the head of the unit to be evaluated, or (iii) any legal successor of the legal entity to be evaluated;
3. Is in any way involved in the managing of any legal entity to be evaluated;
4. Is employed under an employment contract or agreement with any legal entity to be evaluated;
5. Has had relations amounting to research rivalry or professional animosity with any co-author of the outputs or results to be evaluated, or with the head of the unit to be evaluated;
6. Has been a mentor/mentee in relation to (i) any co-author of the outputs or results to be evaluated who is a member of the unit to be evaluated, or (ii) any person from the legal entity or unit to be evaluated...
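The synthesis above is, in effect, a checklist: an evaluator is conflicted if any single listed condition applies, while borderline cases are referred to the panel head or the Coordination Board. The sketch below only illustrates that screening step; the flag names are assumptions mapped loosely onto the conditions listed above and do not model the referral of borderline cases.

```python
from dataclasses import dataclass


@dataclass
class EvaluatorRelation:
    """Illustrative flags only; each roughly corresponds to one of the
    conflict conditions listed in the synthesis above."""
    co_author_of_evaluated_output: bool = False
    close_family_or_personal_tie: bool = False
    involved_in_management_of_entity: bool = False
    employed_or_contracted_by_entity: bool = False
    rivalry_or_professional_animosity: bool = False
    mentor_or_mentee_relationship: bool = False


def has_conflict_of_interest(rel: EvaluatorRelation) -> bool:
    """An evaluator is conflicted if any single condition applies."""
    return any(vars(rel).values())


# Hypothetical example: an evaluator employed by the evaluated legal entity.
print(has_conflict_of_interest(
    EvaluatorRelation(employed_or_contracted_by_entity=True)))  # True
```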
APPENDIX 3: INSTITUTIONAL LCDRO EXPENDITURES IN 2017-2019

Table 2: Institutional LCDRO Expenditures - Increase in 2017-2019 Approved under Government Resolution No. 477 of 30 May 2016 (CZK)

Budgetary Chapter | 2016 pursuant to Act No. 400/2015, the State Budget Act | Increase in 2017 under Government Resolution No. 477/2016 | Increase in 2018 under Government Resolution No. 477/2016 | Increase in 2019 under Government Resolution No. 477/2016
Ministry of Defence (MoD) | 85 913 000 | 3 865 000 | 5 253 000 | 8 576 000
Ministry of Interior (MOI) | 60 675 000 | 2 730 000 | 3 710 000 | 6 058 000
Ministry of Industry and Trade (MIT) | 214 980 000 | 9 671 000 | 13 144 000 | 21 463 000
Ministry of Agriculture (MA) | 391 377 000 | 17 607 000 | 23 929 000 | 39 074 000
Ministry of Education, Youth and Sports (MEYS) LCDRO(1)* | 5 770 877 000 | 251 500 902 | 341 835 426 | 558 103 352
Ministry of Education, Youth and Sports (MEYS) LCDRO(2)** | -180 386 539 | 0 | 0 | 0
Ministry of Culture (MC) | 84 880 000 | 3 819 000 | 5 159 000 | 8 474 000
Ministry of Health (MH) | 637 079 000 | 28 660 000 | 38 952 000 | 63 603 000
CAS | 3 401 674 000 | 153 032 000 | 207 983 000 | 339 610 000
Ministry of Transport (MT) | 14 672 854 | 660 092 | 895 044 | 1 467 285
Ministry of Labour and Social Affairs (MLSA) | 9 547 859 | 429 532 | 582 419 | 954 786
Ministry of Foreign Affairs (MFA) | 9 530 993 | 428 773 | 581 390 | 953 098
Ministry of the Environment (MoE) | 146 634 833 | 6 596 701 | 8 944 721 | 14 663 479
Total | 10 647 455 000 | 479 000 000 | 650 969 000 | 1 063 000 000
Increase (%)*** | | 4.50% | 6.11% | 9.98%

Notes: MT, MLSA, MFA, MoE in 2016 - the amounts allocated under MEYS are approved to be transferred to the given chapters as from 2017, including the increase.
* MEYS LCDRO(1): the LCDRO volume in 2016 including the amount for MT, MLSA, MFA and MoE allocated under MEYS (i.e. including the volume of MEYS LCDRO(2)).
** MEYS LCDRO(2): the LCDRO volume in 2016 for MT, MLSA, MFA and MoE allocated under MEYS.
*** The increase (%) is the LCDRO increase against the 2016 budget, rounded to two decimal places.
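The "Increase (%)" row in Table 2 is each year's total increase divided by the total 2016 LCDRO budget, rounded to two decimal places. A short check of those figures, using only the totals from the table, might look as follows (illustrative only).

```python
# Totals taken from Table 2 (CZK).
base_2016 = 10_647_455_000
increases = {2017: 479_000_000, 2018: 650_969_000, 2019: 1_063_000_000}

for year, increase in increases.items():
    # Percentage increase against the 2016 budget, rounded to two decimals.
    share = round(100 * increase / base_2016, 2)
    print(f"{year}: +{share:.2f} %")
# 2017: +4.50 %, 2018: +6.11 %, 2019: +9.98 %
```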