Phase 2 – For Academic Year 2019-20

Following our work in 2018-19, I am extremely pleased to say that we have won the Innovation Bid again for 2019-20. The project team will consist of the same members.

The evaluation results of our previous project were highly encouraging. All the module leaders who used the tool were impressed with it and would like to continue to use it in the future (subject to some changes). The module leaders who did not use the tool gave clear explanations of the changes they would like to see before they could use it. The main changes and additions we intend to work on during this academic year are as follows:

  • Addition of process quality criteria: Currently, the peer assessment for process is calculated based on one holistic criterion. Ideally, the process needs to be assessed based on various criteria such as commitment, coordination, communication, etc. that can be set by the tutor/lecturer. The scores of all these criteria have to be aggregated using the appropriate weightings (if applicable).
  • Process scoring by tutor: Normally, the process is assessed by students using peer assessment. However, in some cases, tutors may want either to assess the process based solely on their own observations or to override the peer assessment scores.
  • Tolerance Factor (z) of Student score: The tolerance factor determines the extent to which an individual student's score is allowed to deviate from the group score. The default value is 2, but the tutor should be able to adjust this value.
  • Peer Assessment Impact Factor (p): Tutors should be able to adjust the impact that peer assessment scores have on an individual's final score/grade. The default value is 1; when p = 0, peer assessment has no impact at all. A rough sketch of how these parameters and the criteria weightings might combine is given after this list.
  • Overriding Final Student Score/Grade: Tutors would like the flexibility to override the resulting student score/grade where there is exceptional contextual evidence not captured by the model's process criteria or the parameters z and p. In such cases, the tool should allow the tutor to clearly record the reason for overriding the score/grade, which would be useful for both students and external examiners.
  • Usability: Several changes are to be made to the tool to make it more intuitive for tutors to use. These changes will be based on ongoing feedback from users. For example, one tutor would like the flexibility to make changes after students have been allocated to specific groups.
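
To illustrate how these adjustments might fit together, here is a minimal sketch in Python. It assumes a simple weighted-average aggregation of the process criteria and one possible interpretation of z (as a cap on how far an individual score may deviate from the group score) and p (as a multiplier on the peer adjustment). The function and parameter names are hypothetical and do not reflect the tool's actual implementation.

    # Hypothetical sketch only: the formulas below are assumptions,
    # not the project's actual scoring model.

    def aggregate_process_score(criterion_scores, weights):
        """Weighted average of per-criterion peer scores
        (e.g. commitment, coordination, communication).
        Weights need not sum to 1."""
        total_weight = sum(weights[c] for c in criterion_scores)
        return sum(criterion_scores[c] * weights[c]
                   for c in criterion_scores) / total_weight

    def individual_score(group_score, peer_factor, z=2.0, p=1.0):
        """Adjust the group score for one student.

        peer_factor: relative peer rating of the student (1.0 = average).
        z: tolerance factor, assumed here to cap the deviation from the
           group score (default 2).
        p: peer assessment impact factor; p = 0 removes any peer impact.
        """
        adjustment = p * (peer_factor - 1.0) * group_score
        adjustment = max(-z, min(z, adjustment))  # assumed use of z
        return group_score + adjustment

    # Example usage with made-up numbers
    process = aggregate_process_score(
        {"commitment": 4.0, "coordination": 3.5, "communication": 4.5},
        {"commitment": 2.0, "coordination": 1.0, "communication": 1.0})  # -> 4.0
    final = individual_score(group_score=65.0, peer_factor=1.1)          # -> 67.0

Tutor overrides of the process score or the final grade would then simply replace these computed values, with the stated reason stored alongside.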

Updates on the progress of this project will be posted on this site.

– Suraj Ajit
