Final Report 2019-20

Final Report

(To be completed by projects funded by the Learning and Teaching Enhancement and Innovation Fund 2019-20)


  Key contact details
Author(s): Dr Suraj Ajit
Job title(s): Senior Lecturer/Programme Leader (Computing)
Faculty: FAST
Date submitted: 30/06/2020


  1. Project title

Implementation of a fair group marking and student scoring scheme based upon separate product and process assessment responsibilities



  2. Project aims and objectives

Please use the table below to provide information on the intended aims and objectives of your project (the ones stated in your project proposal) and the aims and objectives that have been achieved.


Intended aim/objective: To involve students as partners in the assessment of group work (or collaborative learning) in a fair and consistent manner
Achieved in full? Yes
Comments: The tool/model was deployed in the following modules: CSY1019 (Software Engineering 1), CSY2027 (Group Project – Computing), CSY2027 (Group Project – Networking), CSY1024 (Games Techniques), CSYM028 (Modern Computer Architecture).

Intended aim/objective: To improve student (and staff) satisfaction by adopting a new approach to group work assessment
Achieved in full? Yes
Comments: The staff and students who used the model and tool have given positive feedback and are happy with the new approach.

Intended aim/objective: To inform institutional policy on assessing group work
Achieved in full? Yes
Comments: The positive results from this project will go a long way towards informing institutional policy on assessing group work (initially in the subject area of computing).
  3. Project outputs and deliverables

Please use the table below to provide information on your project’s outputs and deliverables (the ones stated in your project proposal) and the outputs and deliverables that have actually been achieved.


Intended output/deliverable: A web-based tool for assessing group work
Achieved in full? Yes
Comments: A fully functional tool has been developed and deployed in several modules within computing.

Intended output/deliverable: A technical report detailing all the work done
Achieved in full? Yes
Comments: This report gives an overview of the work carried out. Additional detailed reports are available for specific aspects of the project (e.g. functional specification, user guide).

Intended output/deliverable: A draft for submission to a pedagogical conference/journal (AHE 2020/PRHE Journal)
Achieved in full? Yes
Comments: The AHE 2020 conference was deferred due to COVID-19. Instead, proposals were submitted to the Advance HE Learning and Teaching Conference and the IEEE Conference on Software Engineering Education and Training. Both were accepted.

Intended output/deliverable: Tool demonstration/showcase and dissemination of findings at the UoN L&T Conference
Achieved in full? Yes (at other conferences)
Comments: Dissemination is due as follows:
  • 7–9 July 2020 – Advance HE Conference, UK (recorded talk/demo)
  • 9–12 November 2020 – IEEE CSEE&T Conference, Munich, Germany



  4. Project evaluation

Please use this space to provide information on the methods that you used for carrying out an evaluation of the project, and the key findings and results from the evaluation.


The evaluation was carried out in two phases:


Phase 1: The Level 4 module Software Engineering 1 (CSY1019) has group work as its Term 1 assessment. The students work collaboratively on the assignment tasks and submit a group report. The tutor marks the report using standard rubrics on NILE to allocate a group grade; this corresponds to the product score in our model. The module tutor created a CSV file containing a list of groups and students, which was imported into GPM (our tool). A peer assessment was then set up on GPM, in which each student can rate the other members of their group (not including themselves). GPM generates a URL for the peer assessment.


Following the report submission deadline, an announcement about the peer assessment and its URL was made to the students on the module, who were given a specified date by which to complete it. The module consisted of around 120 students, split into 12 groups of 10 members each. Following the peer assessment, the data was exported into an Excel spreadsheet. The module tutor then used this data, together with the group grade, to calculate individual grades, applying a peer adjustment factor based on his own observations and notes. We imported all the group information, peer assessment data and group grades into GPM version 2 and used it to calculate individual grades. A comparative analysis of the module tutor's individual grades against those produced by GPM (version 2) showed marginal but no significant differences. We then sent the comparative analysis to the module tutor and arranged an interview to discuss the results produced by GPM (version 2).


Key findings:

  • The tutor was happy with the grades produced by GPM (version 2).
  • The tutor said he would definitely like to use GPM in the future, believing it would be a great asset that saves time and improves efficiency in the long term.
  • The tutor agreed to use it for an additional group project module in the next term.

Phase 2: Following Phase 1, all improvements and changes planned in the proposal were implemented and a new version of the tool, GPM (version 3), was released. The leaders of the following modules agreed to deploy the tool:

  1. CSY2027 (Group Project – Computing)
  2. CSY2027 (Group Project – Networking)
  3. CSY2027 (Group Project – Software Engineering)
  4. CSY1024 (Games Techniques)
  5. CSYM028 (Modern Computer Architecture).

A video tutorial of GPM was created and distributed to the module leaders, who consulted me for clarification where needed. They used the tool successfully through all phases:

  1. Import a CSV file containing the list of students.
  2. Create and disseminate the peer assessment link.
  3. Calculate the group (product) score on NILE and enter it into GPM.
  4. GPM combines the peer assessment score with the group (product) score to calculate each individual student score.
  5. Export the student scores from GPM to a CSV file.
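The combine-and-export steps (phases 4 and 5) can be sketched as follows. The weighting formula shown, which scales the group score by each student's average peer rating relative to the group-wide mean, is an illustrative assumption, not GPM's exact BASS calculation; the function names `individual_scores` and `export_scores` are likewise hypothetical.

```python
import csv


def individual_scores(group_score, peer_ratings):
    """Combine a tutor-assigned group (product) score with peer (process)
    ratings. `peer_ratings` maps each student to the list of ratings they
    received from the other group members. Assumed formula: scale the group
    score by the student's average rating relative to the group-wide mean,
    capping the result at 100."""
    means = {s: sum(r) / len(r) for s, r in peer_ratings.items()}
    group_mean = sum(means.values()) / len(means)
    return {s: round(min(100.0, group_score * m / group_mean), 1)
            for s, m in means.items()}


def export_scores(path, scores):
    """Export the calculated individual scores to a CSV file (phase 5)."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["student", "score"])
        for student, score in sorted(scores.items()):
            writer.writerow([student, score])
```

For example, with a group (product) score of 60 and average peer ratings of 5, 3 and 4 for three members, this assumed weighting would yield individual scores of 75, 45 and 60 respectively.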

Key findings:

  • The tutors (of modules 1 and 2, in particular) were very impressed with GPM (version 3) and happy with the individual scores it generated. They did not need to use the option to override any grade.
  • One tutor noted that the tool was particularly useful in the COVID-19 situation, as face-to-face meetings with students were not possible.
  • There were no issues or complaints from students.
  • One tutor forgot his account password after setting up groups. The university SMTP server blocks access from applications, so GPM cannot send password-reset emails to tutors; the tutor had to recreate his account. To avoid such problems in future, we plan to introduce an administrative panel with access rights/privileges defined for each user, so that an administrator can oversee all other accounts, access data, reset passwords, etc.


  5. Project impact

Please use the table below to provide information on the intended impact and benefit of your project (the ones stated in your project proposal) and the impact and benefit that have been achieved.


Intended impact and benefit: Staff – there is no tool that appropriately combines the lecturer score and peer (student) score. Staff currently do this using spreadsheets and find it tedious, time-consuming and error-prone.
Achieved in full? Yes
Comments: The improved tool developed through this project eliminates these issues. Several module leaders used it and provided extremely positive feedback; they intend to use it again.

Intended impact and benefit: Students – the student learning experience may be improved by involving students in the (peer) assessment of group work. Students can learn by observing the performance of peers and the different evaluation strategies.
Achieved in full? Yes
Comments: There were no student complaints or issues after using the proposed model/tool. Students found the process fair and transparent.

Intended impact and benefit: The project has the potential to change practice and inform institutional policy/strategy on the assessment of group work.
Achieved in full? In progress
Comments: Evidence of GPM's usage in an increasing number of modules within computing has been gathered over the last two years. The next step is to expand to other subject areas.


  6. Dissemination activities

Please use the table below to provide information on the dissemination activities that have been conducted and their impact.

Dissemination activity: A virtual presentation (30-minute recorded talk and demo) to be given at the Advance HE Conference, 7–9 July 2020.
Impact: This is expected to reach a large international audience. It is hoped that other universities will express interest in using the model/tool.

Dissemination activity: Presentation and full research paper publication at the prestigious IEEE Conference on Software Engineering Education and Training, to be held in Munich, Germany, 9–12 November 2020.
Impact: This international conference is highly reputed, and the acceptance process was very competitive, with an acceptance rate of 37% for full research papers. It is hoped that the publication and presentation will help the model/tool gain traction with a wider audience.

Dissemination activity: Blog.
Impact: The university blog has been updated with the details and progress of each phase of the project.


  7. Budget update

Please use this space to provide an update on your budget, in a suitable format, indicating aspects such as:

  • Project underspend
  • Project overspend
  • Any other relevant aspects in relation to the budget



PAY COSTS (list all staff – one line per employee)
  Description: Part-time MSc Student/Graduate Teaching Assistant (£14 per hour; 1 day = 7 hours)
  Employee name: Andrew Dean
  Days on project: 40
  Cost per day (£): 98.00
  Total cost (£): 3,920.00

NON-PAY COSTS (i.e. production of posters or other outputs; hospitality for events held at UoN; attending conferences, etc. – one line per item)
  Description: Dissemination
  Total cost (£): 1,000.00

TOTAL COST OF PROJECT (£): 4,920.00


Note: The dissemination budget has not yet been fully utilised. However, the money has been set aside (as agreed by the Faculty accountant and Dean) for use in the next academic year. It is intended to cover registration and, possibly, travel for the IEEE Conference on Software Engineering Education and Training, to be held in Munich, Germany, 9–12 November 2020, where the publication has been accepted.


  8. Final reflections

Please use this space to add any other comments and reflections on your project, such as lessons learned.


The project is progressing on a positive trajectory and has been very successful so far. An increasing number of course tutors (and students) have used the model/tool (GPM) compared to the previous year. The next phase is to extend the user base to other subject areas within the university. A full empirical evaluation involving both staff and students would be valuable, with the aim of publishing an article in a reputable, high-impact journal. As discussed with the Head of Learning and Teaching, building an evidence base of use cases would help influence the university's policy on assessing group work. There has been some interest from external universities in using this model/tool; the long-term aim is for a wider audience across UK higher education to adopt it.


Please submit this final report no later than 30th June 2020. Please also make your final report available on your project blog.


We’ll need a couple of weeks to assess your final report and proceed with the release of the final 50% of the fund to your Faculty before the end of the University’s current financial year, which is the end of July 2020. Thank you for your cooperation!


Interview – Michael Opoku Agyeman

Michael is a Senior Lecturer in Computing and has group work as part of the modules he teaches: CSY1024 – Games Techniques (Level 4) and CSYM028 – Modern Computer Architecture (Level 7). He made use of GPM this year for peer assessment. I interviewed him to learn about the model he uses to derive individual student scores. The process he adopts is as follows:

Step 1: Calculate group (product) score using rubrics on NILE.

Step 2: Gather peer ratings from group members

Step 3: Calculate the individual score by adjusting the group score according to the average peer rating for the individual. This could mean moving the group score up or down by one grade point, according to the average peer rating.

If the group score is already at the top of the scale (A+), it is capped there for any further increments of grade points.
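A minimal sketch of this adjustment follows. The 13-point grade scale, the 1–5 peer rating range, and the `neutral`/`band` thresholds are all illustrative assumptions, since the exact cut-offs were not specified in the interview.

```python
# Assumed ordered grade scale, lowest to highest.
GRADES = ["F", "D-", "D", "D+", "C-", "C", "C+",
          "B-", "B", "B+", "A-", "A", "A+"]


def adjust_grade(group_grade, avg_peer_rating, neutral=3.0, band=0.5):
    """Move the group grade up or down by one grade point according to the
    member's average peer rating (assumed 1-5 scale). Ratings within `band`
    of `neutral` leave the grade unchanged; the result is capped at A+ and
    floored at F."""
    i = GRADES.index(group_grade)
    if avg_peer_rating >= neutral + band:
        i += 1  # strong peer ratings: one grade point up
    elif avg_peer_rating <= neutral - band:
        i -= 1  # weak peer ratings: one grade point down
    return GRADES[max(0, min(i, len(GRADES) - 1))]
```

Under these assumed thresholds, a group grade of B with an average peer rating of 4.2 becomes B+, while a group grade of A+ stays capped at A+ regardless of the rating.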



New User Interface – Tool Upgrade

Following the pilot trials, the team decided to develop a new version of the tool with major changes to the user interface. A new requirements specification document was finalised, with emphasis on specific user interface functionality. Further details can be found in: Vossen 2020-03-06 GPM User Interface Design for Group Setting.

Some other requirements as part of the new version are:

– Option for students to post textual comments together with numerical scores

– Option for tutors to override calculated grade and input some comments (justification).

– Implementation of both the BASS and QASS models (BASS is the default)

– Validation of input/export files on import

– Implementation of Peer Criteria

– Comprehensive testing with test cases for each functionality


Changes during Pilot Trial Set-up

The pilot trials were successfully run for the assessments in the relevant modules. The following development work was done during the period:

  • To change the rating scale to suit the tutor's requirements (Suraj to update this after his meeting)
  • To provide a feature to export peer marks from GPM, enabling the tutor to use his own algorithm
  • To provide a feature to delete existing modules on the first webpage
  • To wipe all modules and assignments in the existing database for a fresh start to test BASS in GPM
  • To test GPM (BASS)

Long-term features planned: QASS implementation, UI improvements (student-facing features), graphs of scores, compatibility with other universities, a separate login account for each student, and enabling students to give textual feedback/comments on peers (e.g. a rationale for the score).