Final Report 2019-20

Final Report

(To be completed by projects funded by the Learning and Teaching Enhancement and Innovation Fund 2019-20)

 

  Key contact details
Author(s): Dr Suraj Ajit
Job title(s): Senior Lecturer/Programme Leader (Computing)
Faculty: FAST
Email(s): suraj.ajit@northampton.ac.uk
Date submitted: 30/06/2020

 

  1. Project title

Implementation of a fair group marking and student scoring scheme based upon separate product and process assessment responsibilities

 

 

  2. Project aims and objectives

Please use the table below to provide information on the intended aims and objectives of your project (the ones stated in your project proposal) and the aims and objectives that have been achieved.

 

Intended aims and objectives | Achieved in full? | Comments
To involve students as partners in the assessment of group work (or collaborative learning) in a fair and consistent manner | Yes | The tool/model was deployed in the following modules: CSY1019 (Software Engineering 1), CSY2027 (Group Project – Computing), CSY2027 (Group Project – Networking), CSY1024 (Games Techniques) and CSYM028 (Modern Computer Architecture).
To improve student (and staff) satisfaction by adopting a new approach to group work assessment | Yes | The staff and students who used the model and tool have given positive feedback and are happy with the new approach.
To inform institutional policy on assessing group work | Yes | The positive results from this project will go a long way towards informing institutional policy on assessing group work (initially in the subject area of computing).
  3. Project outputs and deliverables

Please use the table below to provide information on your project’s outputs and deliverables (the ones stated in your project proposal) and the outputs and deliverables that have actually been achieved.

 

Intended outputs and deliverables | Achieved in full? | Comments
A web-based tool for assessing group work | Yes | A fully functional tool has been developed and deployed in several modules within computing.
A technical report detailing all the work done | Yes | This report gives an overview of the work carried out. Additional detailed reports are available for specific aspects of the project (e.g. functional specification, user guide).
A draft for submission at a pedagogical conference/journal (AHE 2020/PRHE Journal) | Yes | The AHE 2020 conference was deferred due to COVID-19. Instead, proposals were submitted to the Advance HE Learning and Teaching Conference and the IEEE Conference on Software Engineering Education and Training. Both were accepted.
Tool demonstration/showcase and dissemination of findings at UoN L&T Conference | Yes (at other conferences) | Dissemination due as follows: 7-9 July 2020, Advance HE Conference, UK (recorded talk/demo); 9-12 November 2020, IEEE CSEE&T Conference, Munich, Germany.

 

 

  4. Project evaluation

Please use this space to provide information on the methods that you used for carrying out an evaluation of the project, and the key findings and results from the evaluation.

 

The evaluation was carried out in two phases:

 

Phase 1: The Level 4 module Software Engineering 1 (CSY1019) has group work as the assessment for Term 1. The students work collaboratively on the assignment tasks and submit a group report. The report is marked by the tutor to allocate a group grade using standard rubrics on NILE. This corresponds to the product score, as per our model. A CSV file containing a list of groups and students was created by the module tutor and imported into GPM (our tool). A peer assessment system was then set up on GPM, in which each student can rate the other members (not including themselves) of their group. GPM generates a URL for the peer assessment.
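The report does not reproduce the layout of this CSV file. As a purely illustrative sketch, assuming one row per student and hypothetical column names and IDs, it might look like this:

group_name,student_id,student_name
Group01,19000001,Student A
Group01,19000002,Student B
Group02,19000003,Student C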

 

Following the report submission deadline, an announcement about the peer assessment and its URL was made to the students on the module. The students were given a specified date by which they had to complete the peer assessment. The module consisted of around 120 students, split into 12 groups of 10 members each. Following the peer assessment, the data was exported into an Excel spreadsheet. The module tutor then used this data together with the group grade to calculate individual grades, applying a peer adjustment factor based on his own observations/notes. We imported all the group information, peer assessment and group grade data into GPM (the proposed tool) version 2 and used it to calculate individual grades. We then carried out a comparative analysis of the module tutor's individual grades against those produced by GPM (version 2). We found marginal differences but no significant ones. We sent the comparative analysis to the module tutor and arranged an interview to discuss the results produced by GPM (version 2).
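The report does not state how the comparison was performed. A minimal Python sketch of one plausible approach, flagging any student whose two grades differ by more than a chosen tolerance (the function name and threshold are illustrative):

def compare_grades(tutor_grades, gpm_grades, tolerance=2.0):
    """Return the students whose tutor-assigned and GPM-calculated grades
    differ by more than `tolerance` marks (an illustrative threshold)."""
    return {
        student: (tutor_grades[student], gpm_grades[student])
        for student in tutor_grades
        if abs(tutor_grades[student] - gpm_grades[student]) > tolerance
    }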

 

Key findings:

  • The tutor was happy with the grades produced by GPM (version 2).
  • The tutor said that he would definitely like to use GPM in the future, as he believes it would be a great asset and would save time and improve efficiency in the long term.
  • The tutor agreed to use it for an additional group project module in the next term.

Phase 2: Following Phase 1, all the improvements/changes planned in the proposal were implemented and a new version of the tool, GPM (version 3), was released. The leaders of the following modules agreed to deploy the tool:

  1. CSY2027 (Group Project – Computing)
  2. CSY2027 (Group Project – Networking)
  3. CSY2027 (Group Project – Software Engineering)
  4. CSY1024 (Games Techniques)
  5. CSYM028 (Modern Computer Architecture).

A video tutorial of GPM was created and distributed to the module leaders, and the tutors consulted me where clarification was needed. They used the tool successfully through all the phases:

  1. Import a CSV file containing the list of students and groups.
  2. Create and disseminate the peer assessment link.
  3. Calculate the group (product) score on NILE and enter it into GPM.
  4. GPM combines the peer assessment score with the group (product) score to calculate each individual student score.
  5. Export the student scores from GPM to a CSV file.
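As an illustration of phases 1, 4 and 5, here is a minimal Python sketch. The file layouts, column names and the combination rule are assumptions made for illustration; the exact formula GPM's model uses is not reproduced in this report.

import csv
from collections import defaultdict

def load_groups(path):
    """Phase 1: read a group-list CSV (hypothetical layout: one row per
    student, with group_name and student_id columns)."""
    groups = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            groups[row["group_name"]].append(row["student_id"])
    return groups

def combine_scores(group_score, avg_peer_rating, expected=3.0):
    """Phase 4: combine the group (product) score with a student's average
    peer rating (0-5). A simple multiplicative rule, not GPM's actual model."""
    if avg_peer_rating is None:  # student received no ratings
        return group_score
    return min(100.0, group_score * avg_peer_rating / expected)

def export_scores(path, scores):
    """Phase 5: write student_id -> individual score out to a CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["student_id", "individual_score"])
        writer.writerows(sorted(scores.items()))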

Key findings:

  • The tutors (of modules 1 and 2 above, in particular) were very impressed with GPM (version 3) and happy with the individual scores it generated. They did not have to use the option to override any grade.
  • One tutor said that the tool was particularly useful in the COVID-19 situation, as face-to-face meetings with students were not possible.
  • There were no issues or complaints from students.
  • One tutor forgot the password to his account after setting up groups. The university SMTP server blocks access from applications, so GPM cannot email tutors to reset passwords, and the tutor had to recreate his account. To avoid such problems in future, it is planned to introduce an administrative panel with access rights/privileges defined for each user; an administrator would be able to oversee all other accounts, access data, reset passwords, etc.

 

  5. Project impact

Please use the table below to provide information on the intended impact and benefit of your project (the ones stated in your project proposal) and the impact and benefit that have been achieved.

 

Intended impact and benefit | Achieved in full? | Comments
Staff: There is no tool that combines lecturer score and peer (student) score appropriately. Staff currently do this using spreadsheets and find it tedious, time consuming and error-prone. | Yes | The improved tool developed through this project eliminates these issues. Several module leaders used it and provided extremely positive feedback. They intend to use it again.
Students: The student learning experience may be improved by involving the students in the (peer) assessment process of group work. Students can learn by observing the performance of peers and the different evaluation strategies. | Yes | There were no student complaints or issues after using the proposed model/tool. Students found the process fair and transparent.
The project has the potential to change practice and inform institutional policy/strategy on assessment of group work | In progress | Evidence of GPM's usage in an increasing number of modules within computing has been obtained over the last two years. The next step would be to expand it to other subject areas.

 

  6. Dissemination activities

Please use the table below to provide information on the dissemination activities that have been conducted and their impact.

Dissemination activities | Impact
A virtual presentation (30-minute recorded talk and demo) is to be given at the Advance HE Conference, 7-9 July 2020 | This conference is expected to attract a large international audience. It is hoped that other universities will express interest in using the model/tool.
Presentation and full research paper publication at the prestigious IEEE Conference on Software Engineering Education and Training, to be held in Munich, Germany, 9-12 November 2020 | This international conference is highly reputed and the acceptance process was very competitive, with an acceptance rate of 37% for full research papers. It is hoped that this publication and presentation will help the work gain traction with a wider audience.
Blog | The university blog has been updated with the details/progress of each phase of the project.

 

  7. Budget update

Please use this space to provide an update on your budget, in a suitable format, indicating aspects such as:

  • Project underspend
  • Project overspend
  • Any other relevant aspects in relation to the budget

 

 

Description | Employee Name | Days on project | Cost per day (£) | Total Cost (£)
PAY COSTS (list all staff, one line per employee) | | | |
Part-time MSc Student/Graduate Teaching Assistant (£14 per hour; 1 day = 7 hrs) | Andrew Dean | 40 | 98.00 | 3920.00
NON-PAY COSTS (i.e. production of posters or other outputs; hospitality for events held at UoN; attending conferences, etc., one line per item) | | | |
Dissemination | | | | 1000.00
TOTAL REQUESTED | | | | 4920.00
MATCH FUNDING FROM SCHOOL OR DEPARTMENT (if applicable) | | | |
TOTAL COST OF PROJECT | | | | 4920.00

 

Note: The dissemination amount has not yet been completely utilised. However, the money has been set aside (as agreed with the Faculty accountant and Dean) for use in the next academic year. It is intended to be used for registration and possibly travel for the IEEE Conference on Software Engineering Education and Training, to be held in Munich, Germany, 9-12 November 2020, at which the publication has been accepted.

 

  8. Final reflections

Please use this space to add any other comments and reflections on your project, such as lessons learned.

 

The project is progressing on a positive trajectory and has been very successful so far. We have had an increasing number of course tutors (and students) using the model/tool (GPM) compared to the previous year. The next phase would be to extend the user base to other subject areas within the university. It would be good to carry out a full empirical evaluation involving both staff and students, with the aim of publishing an article in a reputed journal with a high impact factor. As discussed with the Head of Learning and Teaching, building an evidence base of use cases would help influence the policy on assessing group work within the university. There has been some interest from external universities in using this model/tool. The long-term aim is to get a wider audience across UK higher education using it.

 

Please submit this final report to Ming.Nie@northampton.ac.uk no later than 30th June 2020. Please also make your final report available on your project blog.

 

We’ll need a couple of weeks to assess your final report and proceed with the release of the final 50% of the fund to your Faculty before the end of the University’s current financial year which is the end of July 2020. Thank you for your cooperation!

 

Interview – Michael Opoku Agyeman

Michael is a Senior Lecturer in Computing and has group work as part of the modules he teaches: CSY1024 – Games Techniques (Level 4) and CSYM028 – Modern Computer Architecture (Level 7). He made use of GPM this year for peer assessment. I interviewed him to learn about the model he uses to derive individual student scores. The process he adopts is as follows:

Step 1: Calculate group (product) score using rubrics on NILE.

Step 2: Gather peer ratings from group members

Step 3: Calculate the individual score by adjusting the group score according to the average peer rating for the individual. This could mean moving the group score up or down by one grade point, according to the average peer rating.

If the group score is already the highest grade (A+), it is capped there; peer ratings cannot increment it further.
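A minimal Python sketch of this rule, assuming a hypothetical ordered grade ladder and illustrative rating thresholds for when a grade point moves (the report does not state the exact thresholds):

# Hypothetical grade ladder, lowest to highest.
GRADE_SCALE = ["F", "D-", "D", "D+", "C-", "C", "C+", "B-", "B", "B+", "A-", "A", "A+"]

def adjust_grade(group_grade, avg_peer_rating, low=2.0, high=4.0):
    """Move the group grade up or down by one grade point based on the
    average peer rating (0-5). The low/high thresholds are assumptions.
    The top grade (A+) is capped, as described above."""
    i = GRADE_SCALE.index(group_grade)
    if avg_peer_rating >= high:
        i = min(i + 1, len(GRADE_SCALE) - 1)  # cannot rise above A+
    elif avg_peer_rating <= low:
        i = max(i - 1, 0)  # cannot fall below the bottom of the ladder
    return GRADE_SCALE[i]

# Example: adjust_grade("B+", 4.5) -> "A-"; adjust_grade("A+", 5.0) -> "A+"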

 

 

New User Interface – Tool Upgrade

Following the pilot trials, the team decided to produce a new version of the tool with major changes to the user interface. A new requirements specification document was finalised, with emphasis on specific user interface functionalities. Further details can be found here: Vossen 2020-03-06 GPM User Interface Design for Group Setting.

Some other requirements as part of the new version are:

– Option for students to post textual comments together with numerical scores

– Option for tutors to override calculated grade and input some comments (justification).

– Implementation of both the BASS and QASS models. BASS is to be the default model.

– Validation of input/export files when imported

– Implementation of Peer Criteria

– Comprehensive testing with test cases for each functionality

 

Changes during Pilot Trials Set Up

The pilot trials were successfully run for the assessments in the relevant modules. The following development work was done during this period:

  • To change the rating scale to suit the tutor's requirements.
  • To provide a feature to export peer marks from GPM, enabling the tutor to apply his own algorithm.
  • To provide a feature to delete existing modules on the first webpage.
  • To wipe all modules and assignments in the existing database, giving a fresh start for testing BASS in GPM.
  • To test GPM (BASS).
  • Long-term features planned: QASS implementation, UI improvements (student features), graphs of scores, compatibility with other universities, a separate login account for each student, and enabling students to give textual feedback/comments on peers (e.g. the rationale for a score).

Progress Update – Pilot Trials of the Improved Tool

The new changes proposed for the tool (see post 1 of Phase 2) were all implemented. Three modules were chosen for the pilot trials. They were:

Level 4 modules:

CSY1043: Fundamentals of Computing Systems (Tutor: Dr Michael Opoku)

CSY1019: Software Engineering 1 (Tutor: Mark Johnson)

Level 7 Module:

CSYM028: Modern Computer Architecture (Tutor: Michael Opoku)

All the above modules had group work as part of their assignment. The modules were all set up in the tool, and a list of all the groups was then imported from a CSV file. This CSV file could have been exported from NILE (NILE allows you to form student groups). A web link (URL) to enable peer marking was generated by the tool for each module. This link enables each student to sign in using their student ID and then rate each member of their group on a scale of 0-5, where 0 indicates little to none of the expected contribution and 5 indicates a large contribution to the final piece of work. Self-assessment (rating yourself) was not permitted, in contrast to SparkPlus. An announcement was placed in each of the above modules to inform the students about the link and the procedure for peer marking. Most students participated in the peer marking process; the few who did not were chased using NILE reminders/emails.
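As a minimal sketch of how such a submission might be validated, assuming ratings arrive as a mapping from ratee ID to score (the data structure and function are illustrative, not GPM's actual code):

def validate_submission(group_members, rater, ratings):
    """Check one student's peer-rating submission: every other group member
    must be rated exactly once, self-rating is not allowed, and each score
    must be on the 0-5 scale."""
    expected = set(group_members) - {rater}
    if set(ratings) != expected:
        raise ValueError(f"{rater} must rate exactly these peers: {sorted(expected)}")
    for ratee, score in ratings.items():
        if not 0 <= score <= 5:
            raise ValueError(f"Rating for {ratee} is out of range: {score}")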

A group technical report and presentation were submitted by each group as the final piece of work. This is now being assessed/graded by the tutors using rubrics on NILE. The next step would be to enter the group grade into the tool (GPM) for each group. The tool would then combine the group grade with the average peer score received by each student to calculate each student's individual grade.

The final list of grades for each student may then be exported to a CSV file. It is also possible to export only the peer scores received by each student.

 

 

Progress Update – SparkPlus – Claire Leer Interview

An interview was conducted with Claire Leer, a Senior Lecturer in the Business School, in the first week of January 2020. Claire has been using SparkPlus, another group/peer assessment tool, for the last four years on her Event Management undergraduate course. It is used in one module of over 100 students. The licence is paid annually by the university and costs about £1,500 per year. A walkthrough of the SparkPlus tool was given to me during the interview. Key points from the interview:

– SparkPlus has been useful for Claire and she likes the functionality it offers. The main/unique feature she values is that it allows students to provide textual comments, in addition to ratings/scores, for each other group member. These comments are moderated by Claire and then published anonymously.

– SparkPlus provides an individual account for each student, where he/she can log in. Claire uses the tool in two phases: (a) the students are asked to provide a formative assessment of their peers in the group, rating them on a scale and providing textual comments on their contribution; Claire then organises a meeting with each group to discuss the feedback provided by peers and resolve any issues; (b) the students are asked to provide a summative assessment of their peers in the group. The scores generally improve as a result of the formative assessment.

– Claire finds the user interface of the tool quite clumsy and not very intuitive. She was supportive of developing a new in-house tool within the university.

 

 

 

Progress Update – Implementation

Phase 2 of this project has started off very well. The following changes were implemented in the software tool (GPM):

1. New BASS Model: Boosted Arithmetic Scoring Model, a new model for calculating individual student scores, was implemented.

2. Ability to add peer criteria: The tutor can add a set of criteria for peer marking.

3. Tolerance Factor (z): The tolerance factor determines the extent to which an individual student's score may deviate from the group score. The default value is 2, but the tutor can adjust this value using a slider.

4. Impact Factor (p): The tutors can adjust the impact that peer assessment scores have on an individual's final score/grade using a slider. The default value is 1; when p = 0, peer assessment has no impact at all. A sketch of how z and p might enter the calculation follows this list.

5. Overriding/Adjusting student score/grade: GPM now enables the tutor to override the student score/grade calculated by the model.

6. NILE Import Format: Groups can be allocated and set up on NILE by tutors. NILE has an export feature to export the list of groups (with members) to a csv file. GPM now has the ability to import this file.

7. URL changes: The tool is now available at http://www.computing.northampton.ac.uk/~gpm/
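Items 3 and 4 above describe what z and p control, but not the exact BASS formula. As a purely illustrative reading (an assumption, not the published model), the two parameters might combine like this:

def bass_like_score(group_score, avg_rating, group_mean_rating, z=2.0, p=1.0):
    """Illustrative reading of the z and p parameters, not GPM's actual
    BASS formula. The peer effect is the student's average rating relative
    to the group mean, scaled by p (p = 0 means no impact) and capped at
    +/- z marks either side of the group (product) score."""
    peer_effect = p * (avg_rating - group_mean_rating)
    capped_effect = max(-z, min(z, peer_effect))
    return group_score + capped_effect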

Watch this space for more updates…

 

 

 

Phase 2 – For Academic Year 2019-20

Following our work in 2018-19, I am extremely pleased to say that we have won the Innovation Bid again for 2019-20. The project team will consist of the same members.

The evaluation results of our previous project were highly encouraging. All the module leaders who used the tool were impressed with it and would like to continue to use it in the future (subject to some changes). The module leaders who did not use the tool gave clear explanations of what changes they would like to see before they can use it. The main changes/additions that we intend to work on during this academic year are as follows:

  • Addition of process quality criteria: Currently, the peer assessment for process is calculated based on one holistic criterion. Ideally, the process should be assessed against various criteria, such as commitment, coordination and communication, that can be set by the tutor/lecturer. The scores for these criteria then have to be aggregated using appropriate weightings (if applicable); a sketch of such an aggregation follows this list.
  • Process scoring by tutor: Normally, process is assessed by students using peer assessment. However, in some cases, the tutors may want either to assess the process based solely on their own observations or to override the peer assessment scores.
  • Tolerance Factor (z) of Student score: The tolerance factor determines the extent to which an individual student’s score deviates from the group score. The default value is 2 but the tutor should be able to adjust this value.
  • Peer Assessment Impact Factor (p): The tutors should be able to adjust the impact that peer assessment scores have over an individual’s final score/grade. The default value is 1 and when p = 0, peer assessment has no impact at all.
  • Overriding Final Student Score/Grade: Tutors would like the flexibility to override the resulting student score/grade in the case of exceptional contextual evidence not foreseen by the model's process criteria or the parameters z and p. In such cases, the tool should allow the tutor to clearly state the reason for overriding the score/grade, which would be useful for both students and external examiners.
  • Usability: Several changes are to be made to the tool to make it more intuitive for tutors to use. These changes will be based on ongoing feedback provided by the users. For example, one tutor stated that he would like the flexibility to make any changes after students are allocated to specific groups.
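As a minimal sketch of the criteria aggregation mentioned in the first bullet above, with illustrative criterion names and tutor-set weights (not GPM's actual implementation):

def aggregate_process_score(criterion_scores, weights):
    """Weighted mean of per-criterion peer scores. The criterion names and
    weights are illustrative and would be set by the tutor."""
    total_weight = sum(weights[c] for c in criterion_scores)
    weighted_sum = sum(score * weights[c] for c, score in criterion_scores.items())
    return weighted_sum / total_weight

# Example:
# aggregate_process_score(
#     {"commitment": 4, "coordination": 3, "communication": 5},
#     {"commitment": 0.5, "coordination": 0.25, "communication": 0.25})
# returns 4.0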

Updates on the progress of this project would be posted on this site.

– Suraj Ajit

Final Report 2018-19

Final Report

(For projects funded by the Learning and Teaching Enhancement and Innovation Fund 2018-19)

 

  Key contact details
Author(s): Dr Suraj Ajit
Job title(s): Senior Lecturer/Programme Leader (Computing)
Faculty: FAST
Email(s): suraj.ajit@northampton.ac.uk
Date submitted: 30/06/2019

 

  1. Project title

Implementation of a fair group marking and student scoring scheme based upon separate product and process assessment responsibilities.

 

  2. Project aims and objectives

Please use the table below to provide information on the intended aims and objectives of your project (the ones stated in your project proposal) and the aims and objectives that have been achieved.

 

Intended aims and objectives | Achieved in full? | Comments
To involve students as partners in the assessment of group work (or collaborative learning) in a fair and consistent manner | Yes | The proposed model/tool was deployed in two modules, Software Engineering 1 (Year 1) and Group Project (Year 2). This involved both students and staff.
To improve student (and staff) satisfaction by adopting a new approach to group work assessment | Yes | The staff and students who used the model and tool in the two modules have given positive feedback and are happy with the new approach.
Recommendations/roadmap to inform institutional policy on assessing group work | Yes | A demonstration of the model and tool was given to Dr Rachel Maxwell (Head of Learning and Teaching). A meeting was arranged with her to discuss and finalise the roadmap.

 

  3. Project outputs and deliverables

Please use the table below to provide information on your project’s outputs and deliverables (the ones stated in your project proposal) and the outputs and deliverables that have actually been achieved.

 

Intended outputs and deliverables | Achieved in full? | Comments
A technical report detailing all the work done. In particular, this report would include details of key findings from staff and student interviews, an evaluation of current approaches to assessing group work within the university, a feasibility (pilot) study of a novel approach to group work assessment, and a roadmap/recommendations to inform institutional policy on assessing group work | Yes | This report provides an overview of the work done, including key findings. However, it could be expanded into a more detailed report.
A draft for submission at a pedagogical conference/journal (AHE 2019/PRHE Journal) | Yes | Two submissions were made to two peer-reviewed international conferences (AHE 2019 and the Advance HE Teaching and Learning Conference 2019). Both were accepted. The work was presented at AHE 2019 last week.
A blog, detailing the progress/results of each phase of the project | Yes | The details/progress of each phase of the project have been provided at https://mypad.northampton.ac.uk/groupmarking/

 

  4. Project evaluation

Please use this space to provide information on the methods that you used for carrying out an evaluation of the project, and the key findings and results from the evaluation.

 

The evaluation was carried out in two phases:

 

Phase 1: The Level 4 module Software Engineering 1 (CSY1019) has group work as the assessment for Term 1. The students work collaboratively on the assignment tasks and submit a group report. The report is marked by the tutor to allocate a group grade using standard rubrics on NILE. This corresponds to the product score, as per our model. A peer assessment system was then set up on NILE, in which each student can rate the other members (not including themselves) of their group. The NILE peer assessment system was activated only after the group report had been submitted by the students. The format of the peer assessment system was as follows (table):

 

Following the report submission deadline, an announcement about the peer assessment was made to the students on the module. The students were given a specified date by which they had to complete the peer assessment. The module consisted of around 180 students, split into 18 groups of 10 members each. Following the peer assessment, the data was exported into an Excel spreadsheet. The module tutor then used this data together with the group grade to calculate individual grades, applying a peer adjustment factor based on his own observations/notes. We imported all the group information, peer assessment and group grade data into GPM (the proposed tool) version 1 and used it to calculate individual grades. We then carried out a comparative analysis of the module tutor's individual grades against those produced by GPM (version 1). We found marginal differences but no significant ones. We sent the comparative analysis to the module tutor and arranged an interview to discuss the results produced by GPM (version 1).

 

Key findings:

  • The tutor was happy with the grades produced by GPM (version 1). However, he did not agree with the need to conform to the split-join principle.
  • The tutor said that he would like the flexibility to adjust individual grades. He is happy to use the grades produced by GPM (version 1) as the base grades.
  • The tutor said that he would definitely like to use GPM in the future, as he believes it would be a great asset and would save time and improve efficiency in the long term.
  • He suggested a few features that he would like added to the tool. These were minor features relating to import/export, the inclusion of student names, etc.
  • He particularly liked the zoom factor feature, which lets him amplify the effect of peer assessment on the grade calculation.

Phase 2: Following Phase 1, several improvements/changes were made to the tool and a new version, GPM (version 2), was released. A presentation was given to the computing staff members. The Level 5 Group Project module for Computing (Network Engineering) was then used for evaluation. This module consisted of around 20 students. The module tutor met the tool's software developer for a walkthrough session on using the tool. The tutor input the student details and formed the groups. The report submitted by each group was marked using rubrics on NILE to get the group grade. Peer assessment (on this occasion) was done using GPM (version 2): a unique link generated by GPM for each group was sent to the group members. GPM was then used to calculate the individual grades.

Key findings:

  • The tutor was very impressed with GPM (version 2). He said that it is easier, more efficient and more accurate than his current process.
  • He informed us that he received positive feedback from the students regarding the use of GPM. There were no issues or complaints from students.
  • He informed us that he would like to be able to easily reallocate a student to the right group, in case he made an error in group allocation in the first iteration.

 

  5. Project impact

Please use the table below to provide information on the intended impact and benefit of your project (the ones stated in your project proposal) and the impact and benefit that have been achieved.

 

Intended impact and benefit | Achieved in full? | Comments
Staff: There is no tool that combines lecturer score and peer (student) score appropriately. Staff currently do this using spreadsheets and find it tedious, time consuming and error-prone. | Yes | The tool developed through this project eliminates these issues. Two module leaders used it and provided extremely positive feedback. They intend to use it again.
Students: The student learning experience may be improved by involving the students in the (peer) assessment process of group work. Students can learn by observing the performance of peers and the different evaluation strategies. | Yes | There were no student complaints or issues after using the proposed model/tool in two modules. Students found the process fair and transparent.
The project has the potential to change practice and inform institutional policy/strategy on assessment of group work. | Partly (in progress) | A roadmap has been developed following a meeting with the Head of Learning and Teaching. To put it into practice, more evidence of widespread usage needs to be gathered, both within computing and in other subject areas in the faculty.

 

  6. Dissemination activities

Please use the table below to provide information on the dissemination activities that have been conducted and their impact.

Dissemination activities | Impact
A presentation of the project was given to all computing staff at our regular staff meeting on 27th March 2019 | Five module leaders expressed interest in trying out the proposed model/tool for their modules.
International Assessment in Higher Education Conference 2019, Manchester | A presentation was given to a large international audience. Several people engaged and asked questions. It is hoped that a network will be established for further interaction.
Blog | The details/progress of each phase of the project have been provided at https://mypad.northampton.ac.uk/groupmarking/
Advance HE Teaching and Learning Conference 2019, Newcastle | Yet to present (the conference is scheduled for next week).

 

 

  7. Budget update

Please use this space to provide an update on your budget, in a suitable format, indicating aspects such as:

  • Project underspend
  • Project overspend
  • Any other relevant aspects in relation to the budget

 

Description | Employee Name | Total Cost (£)
PAY COSTS (list all staff, one line per employee) | |
Graduate Teaching Assistant (£14 per hour; 1 day = 7 hrs; 40 days) | Andrew Dean | £3920
Conferences* (discounted early bird) | | £392
TOTAL SPEND/AMOUNT EXPECTED | | £4312
ORIGINAL ALLOCATED COSTS | | £4312

 

*Please note that funding/support from the Faculty was used to cover any additional conference expenses.

 

 

  8. Final reflections

Please use this space to add any other comments and reflections on your project, such as lessons learned.

 

The project has been very successful in terms of the aims and objectives it set out to achieve. There are two aspects to this project: one is the model and the other is the tool. It is possible that someone may want to adopt the proposed model but not the proposed tool; that is, staff may want to apply the model using Excel spreadsheets rather than the tool. The project has highlighted that there are many issues/problems with group work faced by both staff and students. It is a huge challenge to get someone to change an existing approach to assessing group work that they may have been using for many years, and people are generally reluctant to invest time in learning a new model or tool. Although five module leaders expressed interest in adopting/evaluating the proposed model and tool, only two did so. It is important for the subject leader to further support the proposed model/tool by encouraging other module leaders to adopt/evaluate it. It is also important to gather further evidence to appropriately critique some of the other approaches adopted by module leaders. Further work is needed to extend both the model and the tool to make them flexible enough to accommodate slightly different approaches while conforming to one underlying framework. The next step would be to get one big subject area, such as computing, to adopt a unified framework; this could then be expanded to other subject areas within the faculty. The Head of Learning and Teaching has been impressed with the project results so far. She is happy for this work to influence the existing institutional policy on assessing group work once enough evidence (including more use cases) has been gathered within the Faculty to support it.

 

 

Please submit this final report to Ming.Nie@northampton.ac.uk no later than 30th June 2019. Please also make your final report available on your project blog.

 

We’ll need a couple of weeks to assess your final report and proceed with the release of the final 50% of the fund to your Faculty before the end of the University’s current financial year. Thank you for your cooperation!

 

 

Successful release of GPM Version 2

Following the feedback received from the pilot trials, several improvements were made to the functionality and usability of the tool. At the end of March, the improved tool was presented by Andy and me (Suraj) to the computing staff at a regular weekly team meeting. The computing staff members teaching/assessing group work were asked to volunteer to take part in the evaluation phase of this project. The tutors of the Computer Networking, Computing and Business Computing group project modules agreed to take part. Following the presentation, one tutor requested a feature to import and export the individual student grades of each group in CSV/Excel format. This feature was then incorporated.

The tool is also now hosted on the computing web domain at:

http://www.computing.northampton.ac.uk/~andy/#/login

If you would like further information or a demonstration, please contact suraj.ajit@northampton.ac.uk.