
1   EVALUATION PROCESS

1.4   The Evaluation

The SRA initiative is a major investment, involving many universities and a large number of researchers across many different research areas, with very different starting points for creating an international top-quality research environment. Several of the 43 funded research environments already existed in an established academic context, while others began to build up their activity with the SRA funding. Evaluating such a multifaceted initiative is therefore a difficult task.

Given the complex nature of the SRA initiative, preparations for the evaluation began with a pre-study in 2013, during which the project group worked intensively and in close collaboration with the steering group to define the main questions for the evaluation and to design the data collection by identifying the key components, activities, outputs and goals of the SRA initiative. The working group used the Government Bill, the commissions to the involved agencies and the call for proposals to identify the purpose(s) of the initiative, as well as the activities, outputs, and intermediate and final outcomes of the SRA (see Appendix 3). This was done in order to reduce the complexity of the SRA initiative and to produce a logical summary of its key factors, so that the data collection and analysis could be focused.

In December 2013, leaders of the SRA host universities were invited to a meeting where the guiding principles and overall design of the evaluation were presented. Detailed information about the evaluation was sent to the SRA research environments later the same month. The identified focal points of the assessment were also discussed at a meeting with the Ministry of Education and Research in the early spring of 2014.

1.4.1 Data used for the evaluation

Multiple sources of data were used for the evaluation process of the Strategic Research Areas:

1) The original government call for proposals

2) The original application for SRA-grants from each research environment

3) 2010–2013 monitoring reports from the SRA research environments

Summaries of the 2010–2013 SRA monitoring data 2 for each SRA research environment were prepared by the agencies for the external reviewers and the expert panel. Each report summarised the overall development of the strategic research environment, including overviews of personnel, sources of income, use of government funding, and data on doctoral and licentiate degrees, conferences and visiting researchers. Qualitative and quantitative information from the monitoring reports on strategic importance to society and industry, collaborations, education, etc. was also included.

2 SRA Monitoring reports (in Swedish) can be downloaded from http://www.vr.se/amnesomraden/amnesomraden/strategiskaforskningsomraden/arligauppfoljningar.4.7e727b6e141e9ed702b12fb2.html

4) Self-evaluations of the SRA research environments

Self-evaluations were collected from the research environments during March–May 2014. The self-evaluation focused on open-ended and process-oriented questions covering the five dimensions of the evaluation. (The self-evaluation questionnaire can be found in Appendix 5.)

5) Bibliometric analysis

The analysis was based on publication data obtained from the lists of scientific peer-reviewed publications in refereed journals listed by the research environments in the 2010–2013 monitoring reports. The analysis includes all reported publications indexed in the Web of Science with publication years 2010–2013. The research environments were asked to complete their publication lists with Accession Numbers from the Web of Science. All bibliometric statistics were compiled using the publication database at the Swedish Research Council. The humanities, social sciences and engineering sciences are underrepresented in this database because of its limited coverage of books, book chapters and proceedings.

6) Self-evaluations of the SRA host universities

Self-evaluations from the university management of the host universities were collected during March–June 2014. (The self-evaluation questionnaire can be found in Appendix 6.)

7) Interviews with university and SRA leadership

The expert panel conducted hearings with representatives of the leadership of each university and research environment in Stockholm in the first week of December 2014.

1.4.2 Evaluation Process

During May–August 2014, each research environment was assessed by two external reviewers using data sources 1–5 listed above. Each environment was evaluated on its own merits, from its individual starting point. In order to select the best reviewers for each SRA, the recruited external reviewers were asked to rank their expertise in relation to the research of the SRA environments. The two most suitable reviewers were then assigned to each SRA. They first conducted an individual assessment of their assigned SRA, using criteria and grades for different themes in the five dimensions (see Appendix 7).

The research environments were not compared to each other by the external reviewers. Instead, the assessments focused on each environment's own journey towards producing research at the international forefront. Five dimensions were assessed by the external experts:

• Research Output (publication profile and scientific impact). Grades used: Not convincing so far, reaching international standards or on the frontline.

• Utilisation and Benefits (capacity to transfer research results, stakeholder engagement in problem formulation, impact on society and business, capacity to provide qualified personnel or research based knowledge). Grades used: Not developed satisfactorily, developed satisfactorily or developed with great satisfaction.

• Collaboration (collaboration between co-applicant universities, collaboration with other SRAs, international collaboration, strategic collaboration outside of academia). Grades used: Not effective so far, effective in several dimensions or effective in all dimensions.

• Integration with Education (the integration of the research environment with different levels of education).

• Management (management of the research environment, use of recruitment relative to the goals and intentions of the environment, management capacity with regard to societal needs). Grades used: Not convincing so far, on target and developing with high standards or moving beyond set goals.

The two external reviewers assigned to each research environment co-authored an Evaluation Report for the environment in question, grading it (see above) on each dimension (the dimensions are not weighted against each other). The reports also include a short description motivating the assessment. This initial step thus resulted in 43 assessment reports, one for each research environment, in which the external experts evaluated the present status of the research environments and stated to what degree they had reached their goals (see Appendix 4).

The expert panel met, or held telephone/Skype meetings, with the project group of the agencies in the spring and autumn of 2014 to prepare for the evaluation process. During the autumn of 2014 the expert panel received all of the evaluation data (see above) 3, and held interviews with the host universities and their individual SRAs during the first week of December 2014. The expert panel's evaluation (Chapter 2) focused on assessing the outputs and added value of the 43 SRAs in the light of the government goals for this funding initiative and the strategic priorities made by the HEIs. The panel considered the strategic management and use of the SRA funding in order to conclude whether, and in what way, the SRA initiative as such has provided added value to the research system in Sweden. The panel was specifically asked to address the question of whether the results in the SRA environments can be attributed to HEI strategies and the management of the SRAs.

To facilitate its analysis, the expert panel used an assessment protocol (Appendix 8):

1) SRA Research Environment protocols were used to support the panel's preparations for the hearings of SRA representatives, and also served as the starting material for drafting a report. Before each interview, clarifying questions were written into the protocol, and the answers to the questions were recorded, together with general impressions received during the interview. After the interview, the panel completed the protocol for each HEI and provided a grading for each criterion (Inadequate, Good or Excellent), with arguments based on all the data available for the evaluation.

2) Representatives of each SRA were interviewed with essentially the same questions as the HEIs. The answers and general impressions were then summarised in the evaluation protocols, as described above for the HEI leadership, using the same grading.

3 The expert panel also had access to all of the data used by the external reviewers.