Final Report

(For projects funded by the Learning and Teaching Enhancement and Innovation Fund 2018-19)

 

  Key contact details
Author(s): Dr Suraj Ajit
Job title(s): Senior Lecturer/Programme Leader (Computing)
Faculty: FAST
Email(s): suraj.ajit@northampton.ac.uk
Date submitted: 30/06/2019

 

  1. Project title

Implementation of a fair group marking and student scoring scheme based upon separate product and process assessment responsibilities.

 

  2. Project aims and objectives

Please use the table below to provide information on the intended aims and objectives of your project (the ones stated in your project proposal) and the aims and objectives that have been achieved.

 

Intended aims and objectives | Achieved in full? | Comments
To involve students as partners in the assessment of group work (or collaborative learning) in a fair and consistent manner | Yes | The proposed model/tool was deployed in two modules – Software Engineering 1 (Year 1) and Group Project (Year 2). This involved both students and staff.
To improve student (and staff) satisfaction by adopting a new approach to group work assessment | Yes | The staff and students who used the model and tool in the two modules gave positive feedback and are happy with the new approach.
Recommendations/roadmap to inform institutional policy on assessing group work | Yes | A demonstration of the model and tool was given to Dr Rachel Maxwell (Head of Learning and Teaching). A meeting was arranged with her to discuss and finalise the roadmap.

 

  3. Project outputs and deliverables

Please use the table below to provide information on your project’s outputs and deliverables (the ones stated in your project proposal) and the outputs and deliverables that have actually been achieved.

 

Intended outputs and deliverables | Achieved in full? | Comments
A technical report detailing all the work done, including key findings from staff and student interviews, an evaluation of current approaches to assessing group work within the university, a feasibility (pilot) study of a novel approach to group work assessment, and a roadmap/recommendations to inform institutional policy on assessing group work | Yes | This report provides an overview of the work done, including key findings. However, it could be expanded into a more detailed report.
A draft for submission to a pedagogical conference/journal (AHE 2019/PRHE Journal) | Yes | Two submissions were made to two peer-reviewed international conferences (AHE 2019 and Advance HE Teaching and Learning Conference 2019). Both were accepted. The work was presented at AHE 2019 last week.
A blog detailing the progress/results of each phase of the project | Yes | The details and progress of each phase of the project are available at https://mypad.northampton.ac.uk/groupmarking/

 

  4. Project evaluation

Please use this space to provide information on the methods that you used for carrying out an evaluation of the project, and the key findings and results from the evaluation.

 

The evaluation was carried out in two phases:

 

Phase 1: The Level 4 module Software Engineering 1 (CSY1019) has group work as its Term 1 assessment. The students work collaboratively on the assignment tasks and submit a group report. The report is marked by the tutor to allocate a group grade using standard rubrics on NILE; this corresponds to the product score in our model. A peer assessment system was then set up on NILE, where each student could rate the other members of their group (not including themselves). The NILE peer assessment system was activated only after the group report had been submitted by the students. The format of the peer assessment system was as follows (table):

 

Following the report submission deadline, an announcement about the peer assessment was made to the students on the module, who were given a specified date by which to complete it. The module consisted of around 180 students, split into 18 groups of 10 members each. Following the peer assessment, the data was exported into an Excel spreadsheet. The module tutor then used this data, together with the group grade, to calculate individual grades, applying a peer adjustment factor based on his own observations and notes. We imported all the group information, peer assessment data and group grades into version 1 of GPM (the proposed tool) and used it to calculate individual grades. We then carried out a comparative analysis of the module tutor's individual grades against those produced by GPM (version 1) and found only marginal differences, none of them significant. We sent the comparative analysis to the module tutor and arranged an interview to discuss the results produced by GPM (version 1).

 

Key findings:

  • The tutor was happy with the grades produced by GPM (version 1). However, he did not agree with the need to conform to the split-join principle.
  • The tutor said that he would like the flexibility to adjust individual grades; he is happy to use the grades produced by GPM (version 1) as the base grades.
  • The tutor said that he would definitely like to use GPM in the future, as he believes it would be a great asset and would save time and improve efficiency in the long term.
  • He suggested a few features that he would like added to the tool. These were minor features relating to import/export, the inclusion of student names, etc.
  • He particularly liked the zoom factor feature, which lets him amplify the effect of the peer assessment on the grade calculation (a sketch of this calculation is given below).
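
This report does not set out GPM's exact formula, but as a rough illustration of the kind of calculation involved – a tutor-awarded group grade (the product score) adjusted by peer ratings (the process score), with a zoom factor amplifying the adjustment – a minimal sketch in Python is given below. The 1–5 rating scale, the form of the peer adjustment factor (a student's mean rating divided by the group mean) and the zoom behaviour are assumptions made for illustration, not GPM's actual implementation.

# Hypothetical sketch: combining a group grade with peer ratings.
# The rating scale, adjustment formula and zoom behaviour are assumptions,
# not the actual GPM (version 1) implementation.

def individual_grade(group_grade, ratings_for_student, all_ratings, zoom=1.0):
    """Adjust a tutor-awarded group grade (product score) by peer
    assessment ratings (process score) to give an individual grade."""
    student_mean = sum(ratings_for_student) / len(ratings_for_student)
    group_mean = sum(all_ratings) / len(all_ratings)
    adjustment = student_mean / group_mean            # peer adjustment factor
    adjustment = 1.0 + zoom * (adjustment - 1.0)      # zoom amplifies the effect
    return min(100.0, group_grade * adjustment)       # cap at the maximum mark

# Example: group grade of 65, one student rated slightly above the group average.
print(round(individual_grade(65.0, [4, 5, 4], [4, 5, 4, 3, 4, 3, 4, 4, 3], zoom=1.5), 1))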

Phase 2: Following Phase 1, several improvements and changes were made to the tool and a new version, GPM (version 2), was released. A presentation was given to the computing staff members. The Level 5 Group Project module for Computing (Network Engineering) was then used for evaluation; this module consisted of around 20 students. The module tutor met the tool's software developer for a walkthrough session on using the tool. The tutor input the student details and formed the groups. The report submitted by each group was marked using rubrics on NILE to obtain the group grade. Peer assessment on this occasion was done using GPM (version 2): a unique link generated by GPM for each group was sent to that group's members, and GPM was then used to calculate the individual grades.
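
As a simple illustration of the per-group link mechanism, the sketch below shows one way such links could be issued. The base URL and the token scheme are placeholders for illustration only; the report does not describe how GPM actually generates its links.

# Hypothetical sketch: issuing one hard-to-guess peer-assessment link per group.
# The URL and token scheme are placeholders, not GPM's actual behaviour.
import secrets

BASE_URL = "https://gpm.example.ac.uk/peer-assessment"   # placeholder address

def make_group_links(group_names):
    """Return a mapping from each group to a unique link that can be
    emailed to that group's members for the peer assessment."""
    return {name: f"{BASE_URL}/{secrets.token_urlsafe(16)}" for name in group_names}

for group, url in make_group_links(["Group A", "Group B"]).items():
    print(group, url)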

Key findings:

  • The tutor was very impressed with GPM (version 2). He said that it is easier, more efficient and more accurate than his current process.
  • He told us that he had received positive feedback from the students regarding the use of GPM. There were no issues or complaints from students.
  • He told us that he would like to be able to easily reallocate a student to the right group in case he makes an error in the initial group allocation.

 

  5. Project impact

Please use the table below to provide information on the intended impact and benefit of your project (the ones stated in your project proposal) and the impact and benefit that have been achieved.

 

Intended impact and benefit | Achieved in full? | Comments
Staff: There is no existing tool that combines the lecturer score and the peer (student) score appropriately. Staff currently do this using spreadsheets and find it tedious, time-consuming and error-prone. | Yes | The tool developed in this project eliminates these issues. Two module leaders used it and provided extremely positive feedback. They intend to use it again.
Students: The student learning experience may be improved by involving students in the (peer) assessment process of group work. Students can learn by observing the performance of their peers and the different evaluation strategies. | Yes | There were no student complaints or issues after using the proposed model/tool in the two modules. Students found the process fair and transparent.
The project has the potential to change practice and inform institutional policy/strategy on the assessment of group work. | Partly (in progress) | A roadmap has been developed following a meeting with the Head of Learning and Teaching. To put it into practice, more evidence of widespread usage, both within Computing and in other subject areas of the Faculty, needs to be gathered.

 

  6. Dissemination activities

Please use the table below to provide information on the dissemination activities that have been conducted and their impact.

Dissemination activities | Impact
A presentation of the project was given to all computing staff at our regular staff meeting on 27th March 2019 | Five module leaders expressed interest in trying out the proposed model/tool for their modules.
International Assessment in Higher Education Conference 2019, Manchester | A presentation was given to a large international audience. Several people engaged and asked questions. It is hoped that a network will be established for further interaction.
Blog | The details and progress of each phase of the project are available at https://mypad.northampton.ac.uk/groupmarking/
Advance HE Teaching and Learning Conference 2019, Newcastle | Yet to present (the conference is scheduled for next week).

 

 

  7. Budget update

Please use this space to provide an update on your budget, in a suitable format, indicating aspects such as:

  • Project underspend
  • Project overspend
  • Any other relevant aspects in relation to the budget

 

Description | Employee Name | Total Cost (£)
PAY COSTS (list all staff – one line per employee) | |
Graduate Teaching Assistant (£14 per hour; 1 day = 7 hrs; 40 days) | Andrew Dean | £3920
Conferences* (discounted early bird) | | £392
TOTAL SPEND/AMOUNT EXPECTED | | £4312
ORIGINAL ALLOCATED COSTS | | £4312

 

*Please note that funding/support from the Faculty was used to cover any additional conference expenses.

 

 

  8. Final reflections

Please use this space to add any other comments and reflections on your project, such as lessons learned.

 

The project has been very successful in terms of the aims and objectives it set out to achieve. There are two aspects to this project: the model and the tool. It is possible that someone may want to adopt the proposed model but not the proposed tool; for example, staff may prefer to apply the model using Excel spreadsheets rather than the tool. The project has highlighted the many issues with group work faced by both staff and students.

It is a huge challenge to get someone to change an approach to assessing group work that they may have been using for many years, and people are generally reluctant to invest time in learning a new model or tool. Although five module leaders expressed interest in adopting or evaluating the proposed model and tool, only two did so. It is important for the subject leader to further support the proposed model/tool by encouraging other module leaders to adopt and evaluate it. It is also important to gather further evidence with which to properly critique some of the other approaches adopted by module leaders.

Further work is needed to extend both the model and the tool so that they are flexible enough to accommodate slightly different approaches while conforming to one underlying framework. The next step would be to get one large subject area, such as Computing, to adopt a unified framework; this should then be expanded to other subject areas within the Faculty. The Head of Learning and Teaching has been impressed with the project results so far, and she is happy for this work to influence the existing institutional policy on assessing group work once enough evidence (including more use cases) has been gathered within the Faculty to support it.

 

 

Please submit this final report to Ming.Nie@northampton.ac.uk no later than 30th June 2019. Please also make your final report available on your project blog.

 

We’ll need a couple of weeks to assess your final report and proceed with the release of the final 50% of the fund to your Faculty before the end of the University’s current financial year. Thank you for your cooperation!