(To be completed by projects funded by the Learning and Teaching Enhancement and Innovation Fund 2019-20)
|Key contact details
|Dr Suraj Ajit
|Senior Lecturer/Programme Leader (Computing)
- Project title
Implementation of a fair group marking and student scoring scheme based upon separate product and process assessment responsibilities
- Project aims and objectives
Please use the table below to provide information on the intended aims and objectives of your project (the ones stated in your project proposal) and the aims and objectives that have been achieved.
|Intended aims and objectives
|Achieved in full?
|To involve students as partners in the assessment of group work (or collaborative learning) in a fair and consistent manner
|The tool/model was deployed in the following modules: CSY1019 (Software Engineering 1), CSY2027 (Group Project – Computing), CSY2027 (Group Project – Networking), CSY1024 (Games Techniques), CSYM028 (Modern Computer Architecture).
|To improve student (and staff) satisfaction by adopting a new approach to group work assessment
|The staff and students who used the model and tool have given positive feedback and are happy with the new approach.
|To inform institutional policy on assessing group work
|The positive results from this project provide a strong basis for informing institutional policy on assessing group work (initially in the subject area of computing).
- Project outputs and deliverables
Please use the table below to provide information on your project’s outputs and deliverables (the ones stated in your project proposal) and the outputs and deliverables that have actually been achieved.
|Intended outputs and deliverables
|Achieved in full?
|A web-based tool for assessing group work
|A fully functional tool has been developed and deployed in several modules within computing.
|A technical report detailing all the work done.
|This report gives an overview of the work carried out. However, additional detailed reports are available for specific aspects of the project (e.g. functional specification, user guide).
|A draft for submission at a pedagogical conference/journal (AHE 2020/PRHE Journal)
|The AHE 2020 conference was deferred due to COVID-19. Instead, proposals were submitted to the Advance HE Learning and Teaching Conference and the IEEE Conference on Software Engineering Education and Training. Both were accepted.
|Tool demonstration/showcase and dissemination of findings at UoN L&T Conference
|Yes (in other conferences)
|Dissemination due as follows:
7 – 9 July 2020 – Advance HE Conference, UK [Recorded Talk/Demo].
9-12 November 2020 – IEEE CSEE&T Conference, Munich, Germany
- Project evaluation
Please use this space to provide information on the methods that you used for carrying out an evaluation of the project, and the key findings and results from the evaluation.
The evaluation was carried out in two phases:
Phase 1: The Level 4 module Software Engineering 1 (CSY1019) has group work as the assessment for Term 1. The students work collaboratively on the assignment tasks and submit a group report. The tutor marks the report and allocates a group grade using standard rubrics on NILE; this corresponds to the product score in our model. The module tutor created a CSV file containing the list of groups and students, and this file was imported into GPM (our tool). A peer assessment was then set up on GPM, in which each student rates the other members of their group (excluding themselves). GPM generates a URL for the peer assessment.
Following the report submission deadline, an announcement with the peer assessment URL was made to the students on the module, who were given a specified date by which to complete the peer assessment. The module had around 120 students, split into 12 groups of 10 members each. Following the peer assessment, the data was exported to an Excel spreadsheet. The module tutor used this data, together with the group grade, to calculate individual grades, applying a peer adjustment factor based on his own observations/notes. We imported all the group information, peer assessment data and group grades into GPM (the proposed tool) version 2 and used it to calculate individual grades. A comparative analysis of the module tutor's individual grades against those produced by GPM (version 2) found only marginal differences, none of them significant. We then sent the comparative analysis to the module tutor and arranged an interview to discuss the results produced by GPM (version 2).
- The tutor was happy with the grades produced by GPM (version 2).
- The tutor said he would definitely like to use GPM in future, describing it as a great asset that he believes would save time and improve efficiency in the long term.
- The tutor agreed to use it for an additional group project module in the next term.
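The report does not reproduce GPM's exact formula for combining the group (product) score with the peer assessment data. The sketch below is therefore only an illustration, under the assumption that a multiplicative peer adjustment factor is used: each student's mean received rating is divided by the group's mean rating, and the group score is scaled by that factor.

```python
# Hypothetical sketch of an individual-grade calculation from a group
# (product) score and peer (process) ratings. The exact formula GPM
# uses is not given in this report; this shows one common scheme.

def individual_grades(group_score, peer_ratings):
    """peer_ratings maps each student to the ratings they received
    from the other group members (self-ratings excluded by design)."""
    # Mean rating received by each student.
    means = {s: sum(r) / len(r) for s, r in peer_ratings.items()}
    group_mean = sum(means.values()) / len(means)
    grades = {}
    for student, mean in means.items():
        factor = mean / group_mean          # peer adjustment factor
        grades[student] = min(100, round(group_score * factor))
    return grades

ratings = {"A": [5, 4, 5], "B": [3, 3, 4], "C": [4, 4, 4]}
print(individual_grades(60, ratings))  # → {'A': 70, 'B': 50, 'C': 60}
```

With a group score of 60, a student rated above the group average is adjusted upwards and one rated below it downwards, while a uniformly rated group would leave every grade at 60.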
Phase 2: Following Phase 1, all improvements/changes planned in the proposal were implemented and a new version of the tool, GPM (version 3), was released. The module leaders of the following modules agreed to deploy the tool:
- CSY2027 (Group Project – Computing)
- CSY2027 (Group Project – Networking)
- CSY2027 (Group Project – Software Engineering)
- CSY1024 (Games Techniques)
- CSYM028 (Modern Computer Architecture).
A video tutorial of GPM was created and distributed to the module leaders, who consulted me for clarification where needed. They used the tool successfully through all phases: (1) import a CSV file containing the list of students; (2) create and disseminate the peer assessment link; (3) mark the group (product) score on NILE and enter it into GPM; (4) GPM combines the peer assessment score with the group (product) score to calculate each individual student score; (5) export the student scores from GPM to a CSV file.
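The import and export steps of this workflow, (1) and (5), can be sketched as below. The column names (`group`, `student`, `score`) are assumptions for illustration; GPM's actual CSV layout is not documented in this report.

```python
# Hypothetical sketch of steps (1) and (5): importing a roster CSV and
# exporting calculated scores. Column names are illustrative only.
import csv
import io

def import_roster(text):
    """Step (1): parse a CSV of (group, student) rows into a dict
    mapping each group to its list of members."""
    groups = {}
    for row in csv.DictReader(io.StringIO(text)):
        groups.setdefault(row["group"], []).append(row["student"])
    return groups

def export_scores(scores):
    """Step (5): write {student: score} out as CSV text."""
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["student", "score"])
    for student, score in scores.items():
        writer.writerow([student, score])
    return out.getvalue()

roster = "group,student\nG1,Alice\nG1,Bob\nG2,Carol\n"
print(import_roster(roster))  # → {'G1': ['Alice', 'Bob'], 'G2': ['Carol']}
```

A round trip like this is what replaces the tutors' manual spreadsheet work described under "Project impact".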
- The tutors (tutors 1 and 2 in particular) were very impressed with GPM (version 3) and happy with the individual scores it generated; they did not need to use the option to override any grade.
- One tutor reported that the tool was particularly useful in the COVID-19 situation, as face-to-face meetings with students were not possible.
- There were no issues or complaints from students.
- One tutor forgot his account password after setting up groups. Because the university SMTP server blocks access from applications, GPM cannot email tutors to reset passwords, so the tutor had to recreate his account. To avoid such problems in future, we plan to introduce an administrative panel with access rights/privileges defined for each user, so that an administrator can oversee all other accounts, access data, reset passwords, etc.
- Project impact
Please use the table below to provide information on the intended impact and benefit of your project (the ones stated in your project proposal) and the impact and benefit that have been achieved.
|Intended impact and benefit
|Achieved in full?
|Staff: There is no tool that combines lecturer score and peer (student) score appropriately. Staff currently do it using spreadsheets and find it tedious, time consuming and error-prone.
|The improved tool developed in this project eliminates these issues. Several module leaders used it and provided extremely positive feedback; they intend to use it again.
|Students: The student learning experience may be improved by involving the students in the (peer) assessment process of group work. Students can learn by observing the performance of their peers and from the different evaluation strategies.
|There were no student complaints or issues after using the proposed model/tool. Students found the process fair and transparent.
|The project has the potential to change practice and inform institutional policy/strategy on assessment of group work
|Evidence of GPM’s usage in an increasing number of modules within computing has been gathered over the last two years. The next step would be to expand it to other subject areas.
- Dissemination activities
Please use the table below to provide information on the dissemination activities that have been conducted and their impact.
|A virtual presentation (30 min recorded talk and demo) is to be given at the Advance HE Conference, 7-9 July 2020
|This is expected to reach a large international audience. It is hoped that other universities will express interest in using the model/tool.
|Presentation and full research paper publication at the prestigious IEEE Conference on Software Engineering Education and Training, to be held in Munich, Germany from 9-12 November 2020
|This international conference is highly reputed, and the acceptance process was very competitive, with an acceptance rate of 37% for full research papers. It is hoped that this publication and presentation will help the model/tool gain traction with a wider audience.
|The university blog has been updated with the details/progress of each phase of the project.
- Budget update
Please use this space to provide an update on your budget, in a suitable format, indicating aspects such as:
- Project underspend
- Project overspend
- Any other relevant aspects in relation to the budget
|Days on project
|Cost per day (£)
|Total Cost (£)
|PAY COSTS (list all staff – one line per employee)
|Part-time MSc Student/Graduate Teaching Assistant (£14ph)
1 day = 7hrs
|NON-PAY COSTS (i.e. production of posters or other outputs; hospitalities for events held at UN; attending conferences, etc. – one line per item)
|MATCH FUNDING FROM SCHOOL OR DEPARTMENT (if applicable)
|TOTAL COST OF PROJECT
Note: The dissemination amount has not yet been completely utilised. However, the money has been set aside (as agreed with the Faculty accountant and Dean) for use in the next academic year. It is intended for registration and possibly travel for the IEEE Conference on Software Engineering Education and Training, to be held in Munich, Germany from 9-12 November 2020. The publication has been accepted.
- Final reflections
Please use this space to add any other comments and reflections on your project, such as lessons learned.
The project is progressing on a positive trajectory and has been very successful so far. An increasing number of course tutors (and students) are using the model/tool (GPM) compared to the previous year. The next phase would be to extend the user base to other subject areas within the university. It would be good to conduct a full empirical evaluation involving both staff and students, with the aim of publishing an article in a reputable, high-impact journal. As discussed with the Head of Learning and Teaching, building an evidence base of use cases would contribute to influencing the policy on assessing group work within the university. There has been some interest from external universities in using this model/tool. The long-term aim is for a wider audience of universities across UK higher education to adopt it.
Please submit this final report to Ming.Nie@northampton.ac.uk no later than 30th June 2020. Please also make your final report available on your project blog.
We’ll need a couple of weeks to assess your final report and proceed with the release of the final 50% of the fund to your Faculty before the end of the University’s current financial year which is the end of July 2020. Thank you for your cooperation!