Overview of the UMF Review Evaluation

During 2017-18, all undergraduate programmes at the University were aligned with the provisions of a revised University Modular Framework (UMF) designed to enable increased innovation in assessment practice and to reduce the assessment burden on students through a rationalisation of learning outcomes and a clear focus on constructive alignment. This project evaluates the impact of that university-wide change on assessment practice, with a view to gathering meaningful data that can contribute to the University’s next TEF submission.

The UMF Review project produced highly practical outcomes: revisions to programme and module learning outcomes and to assessment practice, including changes to marking practice. The next stage is to evaluate robustly the impact of those changes against the underpinning principles of the Review, and against the University’s need to demonstrate meaningful, scalable change that will be reflected in NSS results and the other metrics considered in the TEF.

Project Aims and Objectives

Aims:

  • To evaluate the impact on assessment practice resulting from the University Review of Assessment
  • To evaluate staff perceptions of the value and benefits of the Review

Objectives:

  • Undertake a survey of academic staff exploring their perceptions of (a) the impact of the UMF Review; and (b) the usefulness of the COGS Learning Outcomes toolkit
  • Explore themes and concepts emerging from the staff survey via a series of focus groups / semi-structured interviews with willing participants who self-identify through the survey tool
  • Compare old and new curriculum documentation to determine how assessment practice has quantitatively changed through the UMF Review

Rationale behind the UMF Review itself

The underpinning rationale for the UMF Review was clearly articulated over its development period and as part of the Senate paper that was approved in July 2017. The new assessment provisions were designed to:

  • Increase academic ownership of assessment practice, to ensure that assessments are based on sound pedagogic principles within the broader ABL framework and clearly linked to subject content and learning outcomes.
  • Remove overly prescriptive rules around assessment volume and weightings, to encourage more innovative assessment design and facilitate assessments more closely aligned to subject content and learning outcomes.
  • Reduce the assessment burden placed on our students, with the aim of increasing student achievement, retention, progression and satisfaction.
  • Align our assessment volume with practice in the sector.
  • Rationalise learning outcomes, to ensure clear constructive alignment between content, learning outcomes and assessment and to ensure that learning outcomes remain achievable through reduced volume of assessment.

Evaluation Project Overview

The challenge that we face now is evaluating the impact and effectiveness of the revised programmes and modules to determine whether the stated needs of the project were met by the agreed actions.

In particular, our survey will explore:

  1. Staff perspectives on the role and purpose of learning outcomes
  2. The value staff place on the COGS learning outcomes toolkit, which was designed to support the writing of learning outcomes at the appropriate academic level and the embedding of Changemaker and employability into all programmes and modules, including the importance staff attach to this initiative
  3. How staff perceive the relationship between learning outcomes and assessments
  4. The extent to which staff believe this project will practically realise the University’s statement of graduate attributes
  5. The impact (if any) on ABL teaching practice as a result of the UMF Review

Survey responses will be coded by theme, and these themes will then be explored in greater depth during focus groups or semi-structured interviews with academic staff who have self-identified as willing to participate.

Other issues to be explored once the survey has been completed include:

  • The appropriateness of the new assessment practice (its fitness for purpose)
  • The effectiveness of the UMF Review in realising the ChANGE Framework and in delivering the UoN Graduate.

Alongside these aspects of the research project, we will also compare curriculum documentation to see what practical changes to assessments were introduced as part of the Review. This will include:

  • Comparing assessment instruments across different fields of study
  • Identifying the most popular forms of assessment by discipline and across the board
  • Exploring the relationship between assessment methods and student feedback
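At its core, this documentation comparison is a counting exercise: tallying assessment instruments per module, pre- and post-Review, overall and by discipline. As a minimal sketch of how that tallying could be done (the module records, discipline names and assessment labels below are entirely hypothetical, not real curriculum data), it might look like:

```python
from collections import Counter

# Hypothetical module records extracted from pre- and post-Review
# curriculum documentation (illustrative data only).
modules = [
    {"discipline": "History", "phase": "pre", "assessments": ["essay", "exam"]},
    {"discipline": "History", "phase": "post", "assessments": ["essay"]},
    {"discipline": "Computing", "phase": "pre", "assessments": ["exam", "coursework", "exam"]},
    {"discipline": "Computing", "phase": "post", "assessments": ["portfolio", "coursework"]},
]

def assessment_counts(records, phase):
    """Count assessment instruments across all modules in one phase."""
    counts = Counter()
    for m in records:
        if m["phase"] == phase:
            counts.update(m["assessments"])
    return counts

def counts_by_discipline(records, phase):
    """Per-discipline breakdown of assessment instruments for one phase."""
    by_disc = {}
    for m in records:
        if m["phase"] == phase:
            by_disc.setdefault(m["discipline"], Counter()).update(m["assessments"])
    return by_disc

pre = assessment_counts(modules, "pre")
post = assessment_counts(modules, "post")
print("Most common pre-Review instrument:", pre.most_common(1))
print("Change in total assessment volume:", sum(post.values()) - sum(pre.values()))
```

The same counts, joined with module-level student feedback scores, would support the third bullet (relating assessment methods to feedback), though the real analysis would of course run over the full curriculum dataset.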

The findings of all three elements will be triangulated and turned into a report that can be used by the University to inform future policy and practice initiatives or as supporting evidence in our next TEF submission.

In particular, for the purposes of TEF, we need to evidence:

  • impact on student satisfaction;
  • impact on student achievement;
  • impact on pass rates, withdrawals and progression rates;
  • tutor workloads;
  • external examiner responses.

Outputs and Deliverables

1 x impact evaluation report to the University to inform TEF drawing on the data from the staff survey (phase 1) and the subsequent focus groups/SSIs (phase 2)

1 x data analysis of curriculum documentation covering the UMF Review changes to learning outcomes (including ‘levelness’) and to assessment practice

2 x (draft) articles for publication: (a) evaluation of the UMF Review of assessment; and (b) the next in a series of articles about ‘Changemaker in the Curriculum’ with a particular focus on the value and usefulness of the COGS learning outcomes toolkit in embedding Changemaker into the Curriculum.

Evaluation of the Project

As this project is itself an evaluation of a project undertaken prior to this bid, the success and impact of the UMF Review evaluation will be measured by whether the outputs and deliverables (above) were produced by the 30 June 2019 deadline. It should be noted that the articles may not have been accepted for publication by that date, given that this can be a lengthy process.

More broadly, the evaluation should demonstrate the success and impact of the UMF Review project itself, which will be of greater use to the University.

Intended Benefit and Impact

The project that this bid will evaluate has already sought to improve the student learning experience.

This project will change practice as follows:

  1. Through the introduction of a policy requirement to mark student work explicitly on the extent to which it addresses the requirements of the learning outcomes
  2. Through changes to how staff write their assessment briefs and articulate their expectations of students

This project has already informed significant policy changes. The purpose of this evaluation is to assess the effectiveness of those changes on the student learning experience, particularly as reflected in the NSS and other TEF-related metrics.

Dissemination of Project Outputs

The outputs from this research project will be:

  • Shared with the University via Committee papers and an internal report on the success and impact of the changes
  • Disseminated externally via 2 publications: (a) evaluation of the UMF Review of assessment; and (b) the next in a series of articles about ‘Changemaker in the Curriculum’ with a particular focus on the value and usefulness of the COGS learning outcomes toolkit in embedding Changemaker into the Curriculum.
  • Shared at an external conference (TBD)

Project Workplan

  • Design and distribution of staff survey: July 2018
  • Staff survey open for completion: July – October 2018
  • Recruit Research Assistant: October 2018
  • Survey analysis: October/November 2018
  • Staff focus groups / semi-structured interviews: November 2018 – January 2019
  • Curriculum documentation analysis: October 2018 – March 2019
  • Triangulation, evaluation and writing: January – June 2019
  • Conference presentation: May – June 2019
