Course Evaluation Policy FAQs
This document was initiated in August 2015, but is continually under revision – please send any comments or suggestions for additions to Richard Lowdon.
See also EvaSys FAQs
1. What is the scope of the Course Evaluation policy?
The policy sets minimal requirements for a course evaluation questionnaire – the five CORE questions must be asked (verbatim) for each course, covering all university staff who teach at least three lectures on that course. These are the institutional requirements, which are deliberately kept small.
Outside of this, there is a great deal of flexibility: you can decide when to administer the questionnaire, you can decide which question sets need to be added, you can decide if and how to include other members of staff as necessary, you can decide to use other feedback mechanisms. The decisions made within this permitted flexibility are led by academics, not the policy itself – the two provisos being that you must consider evaluation fatigue, and that the focus is on enhancement, not management.
2. Given the extensive flexibility, what is the best approach?
Our current advice to all schools is to implement only the minimal requirements, as far as is possible within their context. The outcome from doing so will inform what information actually needs to be collected in subsequent semesters (rather than simply the information that has always been gathered in the past).
A guiding principle is that the questionnaire should serve specific purposes; that is, not “it would be interesting to see…” but “we need this information because…”. If we focus on the information that is needed, we can keep the questionnaires short and focussed, which will result in higher response rates and richer qualitative feedback.
3. Who decides which additional questions should be added to the questionnaire?
Questionnaire design is the responsibility of the lecturer/course team. Questions additional to the CORE set should not be added without prior discussion with, and opt-in agreement from, the lecturer/course team.
3.1 Really? Does each course questionnaire really have to be individually tailored to each lecturer’s particular needs?
If the advice given in 2 above is followed, then the administrative load in setting up the questionnaires will be minimised, as only in particular circumstances (e.g. a new course, evidence of teaching excellence) will the questionnaire need to be customised for a particular lecturer.
Some schools may wish to add a set of questions to the questionnaires for all courses in the school – it is advisable only to do so when there is a clear reason as to why the information is required (see 2 above). The policy permits this on an ‘opt-in’ basis: that is, each lecturer needs to know in advance that these additional questions will be added to the questionnaire for their course (and must agree to this).
The Working Group recommended this approach so as to reduce the extent to which EvaSys is seen as a management and assessment tool, and to emphasise the use of questionnaires as a means for an individual member of staff to gather feedback for the purposes of enhancement.
4. This policy means that we will have to do things very differently: why can’t we do things the way we have always done them?
A university-wide policy for course evaluation is necessary for the purposes of consistency and auditability, and to ensure that, where appropriate, excellent practice is rewarded and remedial support offered. The Working Group deliberately made the institutional requirements minimal (see 1 above), while permitting flexibility to accommodate different patterns of teaching.
Importantly, implementing the questionnaire should not simply replace existing effective feedback mechanisms; other forms of evaluation are encouraged and should be carried out in addition to the EvaSys evaluation.
5. Which courses does the Course Evaluation policy cover?
The policy is for credit-bearing courses only, and only covers the design and use of the course evaluation questionnaire. Other methods for gathering feedback can also be used (and, indeed, are encouraged).
The EvaSys system can be used for non-credit courses and for other purposes apart from course evaluation, independent of the policy.
6. Why can't we just 'tweak' the CORE requirements of the policy for our own purposes?
The CORE questions are intended to give a consistent overview of all courses in the university, and these requirements are deliberately kept minimal. Outside of these CORE requirements, the policy is highly flexible.
Our aim is to implement the CORE minimal requirements 'as is' as much as possible – only once we have tried things out will we know whether they will work across the university or not: if any aspect of the policy is shown not to work, then we can feed this information into the policy review process, so that the policy can be amended accordingly. But we have to try things out first!
7. Is it better to administer the questionnaire on paper or on-line?
In general, paper format is currently preferred by most Subjects due to the higher response rate, as paper surveys tend to be distributed and collected during class.
However, on-line surveys have the benefit that they do not take up class time, and automatic reminders can be sent to students who have not yet completed the questionnaire (although there is some concern that students might receive so many such reminders that they ignore them). Other universities report that the response rate to online surveys increases over time, as students become more used to this means of gathering data.
EvaSys allows for questionnaires to be administered online or on paper (but not both for the same course). A scanner is required for paper surveys, and there is additional administration required in scanning the documents.
8. How can I improve response rates for my online surveys?
Please consult the 'Improving Response Rates for Online Surveys' document. This document outlines some of the strategies that Schools have used to increase online response rates.
9. What allowance does the policy make for situations where there is more than one member of staff teaching on a course?
The first CORE (teaching) question should be repeated for each member of university teaching staff on the course team who has delivered at least three classes.
This requirement does not cover GTAs or non-university teachers – feedback may, of course, be gathered for these teachers, but this does not fall within the requirements of the policy.
10. There are only five CORE questions – is this sufficient?
The five CORE questions are considered sufficient for routine, general use, as a basic quality benchmark for a course.
Other question sets can be added for particular purposes (for example, the Teaching Quality set for more detailed information on individual lecturers, or the Course Quality set for new or recently changed courses), but the five CORE questions are intended to stand alone for general, routine use.
11. How were the CORE questions chosen?
The Course Feedback Working Group gave substantial attention to all the questions chosen: the CORE questions as well as the questions in the various question sets. The group reviewed a large number of questions from a wide range of existing surveys, and found that many questions were repetitive, misleading, ambiguous or unclear.
The teaching question is phrased so that it focusses on the ability of teachers to explain the material for the purposes of student learning, preventing teachers from being evaluated on their popularity or entertainment value. The course question focusses on what the Group considered essential for assessing academic quality. The ‘overall’ question mirrors the NSS. The two open CORE questions are standard formative questions that the Group believed are essential both for recognising good quality and for ensuring continual improvement.
12. How are questions scaled in EvaSys?
All EvaSys questions are scaled 1 to 5 (from left to right), regardless of the wording. For example, whether 'Strongly Agree' or 'Strongly Disagree' appears on the left-hand side of the questionnaire, the left-hand statement will always be ascribed a value of 1.
All our questions have 'Strongly Agree' on the left-hand side, so as to match the National Student Survey. This means that low numbers must be interpreted as indicating strong agreement.
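As a concrete illustration, here is a minimal Python sketch of this interpretation; the function name and values are hypothetical, and EvaSys itself does not expose such an API – it simply shows how a raw score can be re-expressed so that a higher number means stronger agreement.

```python
# Illustrative sketch only: EvaSys produces reports and does not expose
# this function; the name and values below are hypothetical.

def agreement_score(evasys_value: float) -> float:
    """Re-express a raw EvaSys value (1 = Strongly Agree on our forms,
    5 = Strongly Disagree) so that a higher number means stronger agreement."""
    if not 1 <= evasys_value <= 5:
        raise ValueError("EvaSys values always lie in the range 1-5")
    return 6 - evasys_value  # 1 -> 5, 2 -> 4, ..., 5 -> 1

print(agreement_score(1))    # 5: the strongest possible agreement
print(agreement_score(1.8))  # 4.2: a question mean of 1.8 indicates strong agreement
```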
13. Who gets to see the results of a course questionnaire?
The policy states that the results of a course questionnaire (typically produced in a pdf document by EvaSys) should only be made available to the lecturer/course team for the course, and their line managers (that is, those in a position to effect change and to influence the individual’s subsequent behaviour by offering additional support, praise, encouragement for promotion, etc.).
The EvaSys configuration will ensure that the report is automatically emailed to the lecturer/course team; the means by which line managers obtain the documents can be agreed at school level – typically they will be made available so that they can be used within the P&DR process.
Members of staff are encouraged to write a narrative reflecting on the questionnaire results to ensure accurate contextual interpretation of the data – EvaSys provides a mechanism for a note to be associated with each evaluation report.
13.1 Is that all? Can no-one else see the results from a course questionnaire?
Individual members of staff can share their results with anyone they wish. So, in those schools which already have a policy of making all evaluations public, the convenors of the L&T committee should request that all members of staff forward their result reports to them.
If approved by the Head of School, access may also be given to the director of the programme that the course is associated with if he/she considers this necessary.
13.2 Why can’t the quantitative data be used to compare the performance of all the courses in the school?
The Course Feedback Working Group felt that the production of ranked league tables of courses or lecturers could result in less willingness from members of staff to engage in the evaluation and feedback process, and that the policy and EvaSys would come to be seen as simply a management tool – rather than a useful tool to be used for the purposes of course enhancement.
14. How long should questionnaires be retained and how should they be destroyed?
Paper questionnaires should be shredded immediately after the survey data has been scanned and verified. Prior to shredding, questionnaires must be held in a secure location, accessible only to the School EvaSys Administrator.
15. What data will be aggregated?
The policy states that the quantitative data from the three closed CORE questions (and only the CORE questions) should be aggregated over all the courses in the school, and subsequently over all the Schools in the College. This aggregated data will be made publicly available.
15.1 Is that all? Are no other aggregations to be made?
Internally, Schools may create their own aggregations if they like – for example, they may wish to aggregate over subject or discipline.
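As an illustration of the kind of aggregation described above, the following minimal Python sketch (with invented course codes and figures; in practice the per-course means would first be taken from the EvaSys reports) computes school-level means for the three closed CORE questions.

```python
# Hypothetical illustration: aggregating the three closed CORE questions
# over the courses in a school. The per-course means below are invented.
from statistics import mean

course_means = {
    "ENG1001": {"teaching": 1.6, "course": 1.9, "overall": 1.8},
    "ENG2002": {"teaching": 2.1, "course": 2.3, "overall": 2.2},
}

school_aggregate = {
    question: round(mean(c[question] for c in course_means.values()), 2)
    for question in ("teaching", "course", "overall")
}
print(school_aggregate)  # {'teaching': 1.85, 'course': 2.1, 'overall': 2.0}
```

Note that averaging course means weights every course equally, regardless of how many students responded; a school may prefer a response-weighted mean instead.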
16. Can I use EvaSys for the collection of research data?
No (at least not at this stage). Under the data access and privacy restrictions of the Course Evaluation Policy, only administrators have access to the system for the purposes of creating surveys; these administrators already carry an extensive additional workload, which cannot also encompass supporting the collection of research data. This might be considered in future years, but not at this stage: at the moment, our focus has to be the collection of valid and appropriately secure course evaluation data, and the university-wide adoption of a consistent evaluation system.
17. Does this mean that EvaSys can ONLY be used for the implementation of the Course Evaluation Policy?
No: there are other situations where it makes sense to use EvaSys to evaluate our wider educational provision. Other school-based evaluations (for example, evaluation of a new programme, SERs for PSRs, mock-NSS surveys, surveys required by accreditation bodies) can be set up by a School EvaSys administrator. It makes sense to label questionnaires/surveys that are associated with courses by their course-code, and those associated with wider school investigations with appropriate labels – clearly the latter are not subject to the policy's CORE questions.
In addition, on request, academic staff can be given Instructor access in the 'Scholarship' unit to enable gathering data for scholarship purposes - on the assumption that appropriate college ethics processes have been followed.
18. What is the best time to distribute questionnaires?
This is dependent on the type of course, and can be decided at school level. Most questionnaires will likely be distributed near the end of the course. However, it is recognised that some courses are taught in teaching blocks, and for such courses the School might consider distributing the questionnaire after each block.
19. What is best practice for providing feedback to students?
A summary of the feedback should be provided to students, together with information on what action is being undertaken as a consequence. This is not the raw data produced in the EvaSys reports, but a verbal summary of the responses, highlighting the most common issues arising from the comments. A simple feedback template is provided in the policy document.
Schools will need to put processes in place to facilitate the production of this summary, according to their own local practices. Typically, the summary will be produced by the course co-ordinator, but administrators may be asked to assist.
This element of the policy increases workload, but is considered a vital part of the evaluation process, as it indicates to students that their responses are taken seriously. Schools will need to establish a means of ensuring the system is monitored and is effective in closing the feedback loop.
The information should be made available to both incoming and outgoing students on the course, as far as is possible. Moodle, the SSLC, and Student Voice are suggested channels for providing the feedback.
20. How is information from the open questions collated?
It is the responsibility of the course lecturer (or course team) to collate and summarise the responses to the open questions for their course, identifying the common themes. Some schools may offer academic staff administrative support in this task.
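For illustration only, a simple keyword tally along the following lines (a hypothetical Python sketch, not part of the policy, and no substitute for actually reading the comments) can offer a rough starting point for identifying themes.

```python
# Purely hypothetical sketch: a crude keyword tally as a first pass over
# open-question comments. The themes, keywords and comments are invented.
from collections import Counter

comments = [
    "More worked examples in lectures would help.",
    "The lab handouts were excellent.",
    "Please post the lecture slides before class.",
]

theme_keywords = {
    "examples": "worked examples",
    "slides": "lecture materials",
    "handouts": "lecture materials",
}

tally = Counter(
    theme
    for text in comments
    for keyword, theme in theme_keywords.items()
    if keyword in text.lower()
)
print(tally.most_common())  # [('lecture materials', 2), ('worked examples', 1)]
```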
21. Will course evaluation feed into Annual Monitoring?
The policy has not been articulated with annual monitoring, but there is potential for it to do so in the future.
22. Why is ‘strongly agree’ on the left of the scale? I expect the scale to go the other way, with ‘strongly disagree’ on the left.
We have used this ordering because it is the one used in the National Student Survey.