The Faculty's research evaluation - Quality and Renewal (KoF24)

Panel members

Here you will find more information about the members of each panel.

Self-evaluations

All documents needed for the self-evaluation process can be downloaded at the SharePoint website.

  • Research programme self-evaluation documents (v4)
  • Department self-evaluation documents (v3)
  • "Base data information document" (provides information on how and when the data was collected and how it can be used to support you in completing your self-evaluations)
  • Documents from ÖB19 and how to receive the self-evaluations from KoF17
  • Presentation of KoF and ÖB
  • "Base data and Analysis Graphs" Excel document

Updated research programme and department self-evaluations

An updated version of the programme (v4) and department (v3) self-evaluation documents, with financial data for 2023, can now be downloaded at the SharePoint website.

Note: In the self-evaluation document, it is now possible to add questions to the panel. Please read the instructions before adding your questions.


The departments send the self-evaluations of the department and of the research programmes to the Office of Science and Technology.

Bibliometrics

Q: We have difficulties identifying the "lead authors" for some of our publications, and it takes time to go through all the individual publications that the program produced during 2017-2022. How shall we proceed? We also have many different frequent publication channels; shall we list all of them?

A: We are aware that publishing practices differ between fields, in particular where author order is used to indicate different levels of contribution. Note that "lead author" is meant to capture that the primary driver of the intellectual work behind the publication came from the research program; this is not necessarily the corresponding author(s) of the publication, although it often is. If there is a problem identifying lead authors for a set of publications, provide an estimate instead.

For the list of the most frequent publication channels in 3.3.4 (up to ten, but not more), it would be too time-consuming for many programs to give an exact number of lead authors, so an estimate is fine.

Q: You have changed the table with bibliometric statistics. Why?

A: We decided to put the focus on the impact indicators and to highlight coverage in relation to the total number of publications (fractions) aggregated over the time interval considered.

Q: Why do you rely on a new Coverage (fractionalized) indicator, and not the usual Web of Science coverage (WoS coverage) indicator reported in UU ABM 2023?

A: First of all, we have the data for WoS coverage if anyone is interested. But it has become apparent to us that misunderstandings can occur when we talk about citation analysis in the CWTS Leiden version of WoS without explaining exactly what subset of WoS-indexed publications is included.

So, let A be a program/department. To highlight the proportion of A's publication fractions that are used in the citation analysis, we instead use P_frac_wos/P_frac_total over the years 2017-2021, where P_frac_wos is the sum of publication fractions from A that are included by CWTS Leiden in the citation analysis, and P_frac_total is the total (fractionalized) publication output in DiVA for A. We also provide the complete list of publications used by CWTS Leiden, to help A reflect on the given statistics.
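As an illustration, here is a minimal sketch of how such a coverage value comes about; the publication fractions and inclusion flags below are invented for the example and do not come from DiVA or CWTS Leiden.

```python
# Hypothetical example of the Coverage (fractionalized) indicator described above.
# Each entry is (publication fraction attributed to unit A, included in the
# CWTS Leiden citation analysis?). All numbers are illustrative, not real data.
publications = [
    (0.50, True),
    (0.25, False),
    (1.00, True),
    (0.33, False),
]

# P_frac_total: total fractionalized publication output registered in DiVA for A.
p_frac_total = sum(frac for frac, _ in publications)

# P_frac_wos: fractions from A that CWTS Leiden includes in the citation analysis.
p_frac_wos = sum(frac for frac, in_wos in publications if in_wos)

coverage = p_frac_wos / p_frac_total
print(f"Coverage (fractionalized): {coverage:.2f}")  # 1.50 / 2.08 -> 0.72
```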

Q: Shall a research program consider all publications made by a researcher during 2017-2022?

A: No, only consider publications affiliated with the research program or UU (as given on the original publication). The focus is on publications, not individual researchers. Please note that this may lead to discrepancies between program statistics and department statistics, since program statistics are calculated from AKKAids while department statistics are calculated as in UU ABM2023. The program may highlight any significant instance of this problem.

Q: Shall a research program consider all citations made to the program's publications?

A: Please note that when we talk of citation impact in KoF, we mean citation impact in the context of CWTS Leiden's version of Web of Science, not in the context of any other citation database. We also consider only citations made to publications from the years 2017-2021.

Q: Can you change the mapping of persons (AKKAids) to research programs determined by heads of department and generate new bibliometric indicator values? Can we add publications from 2017-2022 that we do not find in the set of included publications?

A: No, practically not. The calculations are made on the fixed set of publications that existed in DiVA at the time the metadata was extracted, and on the allocation of people (AKKAids) to research programs made by heads of department in 2023. Generating new bibliometric indicator values would require a new allocation from all heads of department, a new extraction of metadata from GLIS/DiVA, and in addition a new order of analysis from CWTS Leiden via the Planning Division at UU; the latter has recently lost the staff competence that carried out the current analysis. We believe that this requires too many resources in relation to the expected benefits. We encourage all programs to reflect on what they see as shortcomings or discrepancies in the reported statistics.

Q: Where do I find details on the Bibliometrics Data presented, terminology used, and the work done to produce the data?

A: First, see the Base data information document for a description of the sources used and a short description of the work process. Second, see the "Base Data and Analysis Graphs" sheet (to be published later) for the raw data. Third, the Office of Science and Technology can provide additional documentation concerning any detail of the work process that you may be interested in. Send your question to teknat-kof@uu.se.

Q: For the bibliometric statistics reported, almost all data are reported as integers, although some indicators are fractionalized. How many decimal places were originally calculated?

A: One decimal place (two for MNCS); see UU ABM2023 for reference. Keep this in mind when viewing the PPtop10% indicator. Raw data are available.

Q: Can you provide us with bibliometric statistics on individual years?

A: We can, but we have chosen not to highlight them, due to the relative instability of these statistics.

Q: How do you retrieve the publication and citation statistics?

A: In the same way as in the Annual Bibliometric Monitoring (https://mp.uu.se/web/info/vart-uu/bibliometri/bibliometrirapporten), although the addition of research programs as units of analysis requires an extra preparatory step in the analysis.

Q: Why is (my favorite) bibliometric metric/source not being used instead?

A: There is no perfect way to do bibliometrics, and we are aware that the method and data we have chosen are neither perfect nor equally consistent across all of the faculty. We have prioritized transparency and validity when choosing the method, indicators and data source. The source we are using, the CWTS Leiden in-house curated version of Web of Science, is considered the industry standard in the scientometric community.

The process we have chosen is to identify all authors during the time period and assign them to a research program with the help of the departments. Here we rely on the departments to add correct local user ids ("AKKAids") to the list, especially for researchers whom we have not already been able to identify as affiliated authors. From this list we can then identify the publications that were produced by each research program (and their author share). This approach was chosen based on what we have access to and what we can deliver with sufficient quality. The field-normalized citation statistics should be considered particularly carefully by programs and departments if the coverage of those statistics is low, as described in the instructions. If there are particular issues related to this that the program or department wishes to raise with the panel, they should include them in their self-evaluation.

Financial data

Q: How is Other Internal Research calculated and what does it contain?

A: FFF+SFO is the amount of FFF and SFO resources allocated. Other Internal Research is the difference between the total internal funding used by the program/department and the amount allocated.

The FFF+SFO data comes from VP22, while total internal funding, together with co-funding, comes from GLIS/Raindance. This means that there is no raw data for Other Internal Research. Other Internal Research can be, for example, "institutionsresurser" (department resources), "tidsbegränsade resurser" (time-limited resources) and "särskilda satsningar" (special initiatives); see appendices 2.4, 2.7 and 2.15 in VP22.
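As a simple illustration of this derivation, here is a minimal sketch; all figures are invented, and only the subtraction itself reflects the description above.

```python
# Hypothetical figures in kSEK. FFF+SFO comes from VP22; total internal funding
# comes from GLIS/Raindance. Other Internal Research is derived, not raw data.
total_internal_funding = 12_000  # total internal funding used by the program
fff_sfo_allocated = 9_500        # FFF + SFO resources allocated

other_internal_research = total_internal_funding - fff_sfo_allocated
print(f"Other Internal Research: {other_internal_research} kSEK")  # 2500 kSEK
```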

Q: What does Top-10 external funders show?

A: Top-10 external funders shows, for each of the ten largest external funders, the amount spent during the year.

Q: How was the data gathered for the absolute income graphs (1.4 and 3.11) and how should it be used?

A: The goal of this data is to separate long-term (FFF+SFO) internal funding from short-term (co-funding and other) internal funding and external research funding. However, there is no single way to extract these values. To address this, we take the long-term funding from the VP and the other sources from GLIS/Raindance. This means that there are some corner cases (e.g., large transfers of internal resources) that may make the results hard to interpret. See the Base Data Information document for more details. For the purpose of understanding the amount of long-term internal funding in relation to other resources in the program, this data is almost always sufficiently accurate. However, if a program sees a clear issue that it wishes to identify, this can be addressed in the written comments.

Q: Why are we only looking at financial data from 2022 and 2023?

A: We have chosen to look at financial data for just 2022 and 2023 for two reasons: 1) 2020-2021 were affected by COVID, and 2) it is unclear what the panels could extrapolate from a five-year history. However, we understand that individual years can vary significantly, which is why we include two years. If one of the two years was particularly atypical, programs/departments are encouraged to comment on it in their report to alert the panel to those circumstances.

Q: Why is there a focus on particular external funding sources and only grants of over 3M SEK?

A: The request for the number of basic science grants (that is, grants that are available to all fields in the faculty) is intended to account for the fact that different fields have different amounts of external funding available. The grants included in this list (VR project/starting, ERC, and KAW) have similar acceptance rates across all fields in the faculty. However, space is also provided for programs to list other grants that may be field-specific. We have further chosen to limit the list to grants of 3M SEK or more, to make sure that we capture the most important grants without being overloaded with information. If there are particular issues related to this that the program or department wishes to raise with the panel, they should include them in their self-evaluation.

Q: What does "start during the period" mean in table 3.10 (grants)?

A: "Start" means the project started paying out money during the period.

Personnel

Q: How do you count “full courses” in question 6.1?

A: The goal is to report approximately (not precisely) how much of each course is taught by members of the program. For example, if members of the program teach all lectures and labs in a course, the value would be 1.0; if program members teach only one fifth of the course, the value would be 0.2. Please estimate this well enough that the panel gets an accurate view of the program's key teaching, but do not feel that you have to calculate these values precisely.

Q: What is meant by a “course package” in question 6.1?

A: This is an informal term intended to help programs indicate their teaching thematically without having to list every course. For example, a program may teach "parallel programming", which consists of 4 separate courses.

Q: What time period should the teaching in question 6.1 refer to?

A: The evaluation covers the years 2019-2023, inclusive. The teaching should reflect that period, but if there have been significant changes, please report the teaching as at the end of the period. If there have been significant changes since the end of the evaluation period, these can be commented on in the text.

Q: Why are staff assigned outside the department not shown in the personnel data?

A: Responsibility for education programmes and other faculty assignments is usually assigned directly to the faculty and is therefore not included when we collect personnel data for the departments and programs. Assignments made directly by the faculty can be commented on in the text.

Priority area

Q: Should we submit the same priority area multiple times if it is shared across departments?

A: No, try to coordinate and submit one that specifies what the collaborations are.

Q: Are collaborations across faculties encouraged?

A: Yes, focus on what will strengthen research in the program/department. However, we will probably not be enthusiastic about giving FFF:s to other faculties; short-term co-funding, on the other hand, could be very interesting.

Q: Do proposals have to be for (permanent) FFF:s?

A: No, short-term proposals are welcome as long as they improve research. Remember that our goal is to not go over 50% in FFF:s, so non-FFF proposals might help.

Q: Is it realistic to assume permanent (FFF) resources will be allocated to these plans or will it just be short-term funding?

A: Some of the funds may be allocated as permanent (FFF) and some as time-limited. The specific breakdown will depend on the proposals received and the forthcoming decision of the Faculty Board.

Q: Why doesn’t each department get to send more priorities to the faculty?

A: This is a tradeoff between fairness (the number of priorities reflects the size of the unit) and practicality (not too many for the faculty to prioritize). The same issue is present in several sections, and the specific numbers were decided by the Faculty Board.

The best way to work with this is to discuss it with the section dean, but it is important to keep in mind that only a few of the priorities that make it to the faculty will be funded at all. This means that the majority of the value of this process lies in the priorities that can be implemented locally.

Other questions

Q: What role will the panels play in prioritizing funding?

A: All funding prioritization (ÖB) will be handled by the faculty through the regular work of Forskningsberedningen and the Faculty Board. The proposals that will be considered for possible funding will be first prioritized by the individual research programs, then the departments, and then between departments in a section before being considered at the faculty-level. This bottom-up prioritization is designed to make sure that prioritization happens first by people with the appropriate subject-domain expertise. The panels' input on how strong research programs are, and their specific comments on how proposals can be improved, will be taken into account in the final prioritization done at the faculty level.

Q: Why is the research program being asked about something that is handled at the department level?

A: The questions are intended to help us reflect on how we work and where we can improve. However, if a question is not relevant for a research program because the program does not participate in that activity and/or it is completely addressed by the department, then it is completely appropriate to use an explanation provided by the department and make that clear in the response text. For example, if no one in the research program provides career support/mentoring/etc. to Assistant professors and all of that is handled by centralized departmental resources, then it would be appropriate to refer directly to the department. However, if the PAP or other members of the program provide such support, then the program should include that in its reflection.

Q: Why are programs asked to report "infrastructure used" while departments are asked to report "infrastructure supported"?

A: The programs should give a prioritized list of the most important infrastructures they use, which may include infrastructures they do not explicitly pay for (e.g., international ones), while the department should not prioritize but should instead list the infrastructures on which it spends the most money.

Q: Why do the research programs have to submit their reports to the departments a month earlier?

A: The departments have the responsibility to prioritize among their research programs' proposals and their own. For them to be able to do this, they need the programs' proposals ahead of their own deadline. Further, much of the departments' own reflections will include input from their research programs.

Q: Why do we have to submit our report in Word format?

A: We are using Word to make sure that everyone has a consistent form and that we can automatically extract the answer text for each form to a single database. Please be careful to not change anything in the document other than to fill in the tables where your answers are requested, as doing so may make it difficult to automatically extract your answers.

Q: Why does the Word menu appear in various languages when the self-evaluations are opened in SharePoint using Microsoft Word Online?

A: All collaborative workspaces can be displayed in both Swedish and English and are adjusted automatically depending on your browser's language setting. If you want to display the collaborative workspace in English, we recommend setting the language in your browser to English (US). Other varieties of English, such as British English, have been found to produce the wrong language in Office Online; in other words, with a British English setting you can see anything from Finnish to Greek when you open a document in your browser.

How to change the language in your browser:

Microsoft Edge

Mozilla Firefox

Chrome

The faculty evaluation will include the following four aspects:

  1. Development and renewal of research and research environments
  2. Collaboration with the surrounding community
  3. Recruitment, career paths and career support
  4. Links between research and education

The Faculty of Science and Technology will review the distribution of the core research program funding (ÖB) together with the KoF evaluation.

The desired result of KoF/ÖB24 is long-term improvement across the faculty in four areas:

  • The quality of research, through self-reflection, prioritization and external feedback
  • Collegial culture, through broad participation in discussions in research programmes and at departments
  • Understanding of how the faculty functions, through the collection of data about how we work and the sharing of good examples
  • Use of resources, through prioritization of plans at programme and department level, and support based on plans and self-reflection

This will be achieved primarily through self-reflection and external feedback on our strengths and weaknesses, as well as through the development of plans and priorities for the next five-year period. The schedule, purpose and structure were decided by the Faculty Board on 17 October (decision in Swedish).

  • External panels will read our self-evaluations, visit us to discuss how we work, and then provide a final written report with recommendations for how we can improve.
  • Self-evaluations from both departments and research programmes will be used as a basis.
  • Basic data (personnel, finances, postgraduate education, use of resources, etc.) and bibliometrics (number of publications, citations, PPtop10%, etc.) will be assembled centrally for each department and research programme.

For the aspects Collaboration with the surrounding community; Recruitment, career paths and career support; and Links between research and education, the evaluation will focus on our processes and support rather than on evaluating the specific activities.

October-December 2023

2024

  • Departments and research programmes fill out their self-evaluations, with access to basic data. Tentative deadlines: 15 April for the research programmes and 15 May for the departments.
  • Decision on instructions to panels, by the Faculty Board in June.
  • Site visits from panels week 40 (30 September - 4 October).
  • Reports from panels are expected during October/November.
  • The self-evaluations, base data and panel reports will be evaluated by the Faculty research committee, with input from the Faculty education and collaboration committees.

2025

  • Progress report to the Faculty Board. Decision on possible supplementary directives (February).
  • Final report to the Vice-Chancellor (February-April).
  • Presentation of the evaluation at the Vice-Chancellor's evaluation conference.
  • Preparation of the 2026 Faculty Financial Plan (VP) by the Faculty research committee (May - August)
  • Final decision on changes in allocation of core research funding and continued reviews by the Faculty Board (September)
