Production of an online training course looking at how tools can be used to support ABL (Active Blended Learning)

B. The creation of three new online training courses which support and promote the use of technology to enhance learning. 

G. Research, testing and implementation of new and existing technologies.

 

Description – What happened?

Between 2016 and 2018 I created three new online courses for training staff in technology for learning. These covered how to use our VLE, Blackboard Learn (Basics); how to set up and grade assessments (Assessments); and how to use tools to support Active Blended Learning (Enhancements).

This project was commissioned by the University Academic Partnership office to improve training provision for our partner institutions, but was later adopted for use by all tutors at the University of Northampton at the request of our Head of Learning Technology Rob Howe.

Each of the courses was built on our e-portfolio platform, Edublogs (WordPress), using the CoursePress plugin. This platform was chosen over the alternative of a Blackboard Organisation; the main benefits of CoursePress over Blackboard Learn Organisations were:

  • Course users can self-enrol on individual training courses.
  • Each course is independent, but accessible from one link.
  • Embedded tests allow users to record their progress through each course.
  • CoursePress administrators can run reports on staff use of the training courses.
  • The platform is responsive and displays better on mobile devices than Blackboard Learn.

In this portfolio example I will focus on the design and build of the third course, ‘Enhancements’. The purpose of this course is to support the use of technologies which can facilitate Active Blended Learning (ABL).

I began this third course by meeting with colleagues Al Holloway and Liane Robinson to discuss how we would decide which tools we should include. 

When considering these, we assessed whether each tool was:

  • Reliable and well known to the Learning Technology team.
  • Suitable for collaborative learning.
  • Able to provide distinct and tangible benefits to staff for enhancing their students’ learning through the ABL pedagogy.

The tools listed below were chosen as viable options, as they had been thoroughly tested and proven to be useful and reliable for staff. In the case of tests, we chose to focus on how these could be used in groups for team-based learning.

  • Discussion Boards
  • Blogs and Journals
  • Virtual Classrooms
  • Videos (to record presentations) 
  • Tests (for team-based learning)

After agreeing on this list, I tested each tool and reviewed our existing guides. Where guides were absent, outdated, or missing steps, I updated the guidance on our FAQ platform, AskUs, adding links to the tool providers’ own guidance rather than rewriting content where possible.

Within a draft, I combined the tool guides with Learning Technology case studies and Learning Design resources from our site S.H.E.D. I also researched and added links to papers published by other academic institutions where they added to the understanding of these technologies. I then summarised the benefits and considerations for each tool into bullet points.

After completing a rough version of the course (in MS Word), I again shared it with my colleagues for feedback. One recommendation came from Learning Designer Julie Usher, who, having identified the benefits of peer evaluation on the LearnTech blog in 2015, suggested this might be worth considering as an additional unit within the course. I began evaluating the two peer assessment tools institutionally available to staff: the Turnitin peer review tool ‘PeerMark’ and the Blackboard peer review tool ‘Self and Peer Assessment’, and ran two live testing sessions with the team.

Results of live testing of the Blackboard Self and Peer Assessment tool:

  • Setting up the Blackboard Self and Peer Assessment is a logical process.
  • Adding peer review criteria is straightforward.
  • There are a number of useful options in the setup, e.g. the number of peers to review, self-review, and anonymous or public feedback.
  • Submissions are as flexible as standard Blackboard assignments: students can upload any file type (including video and audio via Kaltura, using the Mashup tool).
  • Results are not automatically released to the Grade Centre; there is a moderation area where tutors can check feedback before releasing it to students.
  • Students could easily access the feedback within the familiar ‘Feedback and grades’ area once released by the tutor.
  • After moderation, results are released to the Blackboard Grade Centre.
  • In testing, it was a reliable tool.

 

We also ran testing on the Turnitin PeerMark tool; unfortunately, it did not function correctly and we were unable to complete the testing (we later discovered this was due to problems with its integration with Blackboard Learn).

However, despite these problems, our limited testing and a reading of the supplied guidance showed that the Turnitin PeerMark tool was less suitable, for the following reasons:

  • It is designed as a tool for tutors to review student peer feedback rather than for the feedback to be shared with students.
  • The process requires students to submit first to a standard Turnitin submission point before reviewing in a second, ‘PeerMark’ submission point.
  • PeerMark only accepts formats such as PowerPoint, PDF, Word and text (the same as a normal Turnitin assessment).
  • Feedback is not released to the Grade Centre, so the method of viewing feedback is not consistent with other methods of providing feedback in our VLE.
  • The tool was not updated when Turnitin changed to ‘Feedback Studio’, so it feels dated in comparison to the updated paper assignments.

Having attempted to test both tools, I chose to look further into the Blackboard ‘Self and Peer Assessment’ tool and was fortunate to be able to work with Mark Allenby, Senior Lecturer in Social Work, on a live pilot of the tool in 2018.

In my case study, ‘The case for self and peer assessment in ABL’, Mark provides his view of the benefits and considerations of using this tool based on his experience of testing it with his class. The case study documents Mark’s opinion of the tool, the individual feedback of a mature student who recorded her views on video, and anonymous feedback from the class captured on a Padlet. The overall conclusion was that the tool was useful, but that it required careful use and planning. Based on these findings, I chose to add ‘Peer Review’ as a unit within the ‘Enhancements’ course.

A final draft of the course was reviewed by my colleagues Belinda Green (Learning Technologist), Nicola Denning (Learning Designer) and Rob Farmer (Learning Technology Manager). It was suggested that the course introduction could be developed further, and upon the recommendation of the Head of Learning Technology, Rob Howe, I approached Professor Ale Armellini, Dean of Learning within the Institute of Learning and Teaching, for his assistance.

With the help of his colleague Dr Ming Nie, Professor Ale Armellini produced an introductory video which discussed the relationship between learning outcomes, aligned activities and NILE tool selection. This was recorded using our video tool, Kaltura, and added to the introduction of the course.

Shortly before the launch of the Enhancements course in October 2019, we became aware that an update to our VLE platform, Blackboard Learn, had caused the ‘Self and Peer Assessment’ tool to stop functioning. I contacted the provider and discovered that this was a known issue caused by our upgrade, which would be addressed in our next upgrade of Blackboard Learn. With this tool not functioning and no other alternative available for peer assessment, this area of the ‘Enhancements’ course was removed.

Feelings – what were you thinking and feeling?

I have always strongly believed there was a need for this course, as tools for ABL is a topic that has regularly been discussed within the Learning Technology team. We have put a great deal of work into providing training for staff through face-to-face sessions and one-to-one meetings; however, these have only reached a small number of staff, and transforming this knowledge into an online course has been a goal of mine for some time.

I was therefore strongly motivated to complete this project, but once I began to build the course I soon found that this topic is more challenging than the other areas of ‘Basics’ and ‘Assessments’. These are the reasons why I think it was more challenging than I anticipated:

  • I was more familiar with the workflows for basics and assessments.
  • When discussing possible technologies for ABL, the subject matter became more subjective.
  • The existing guides for these tools were not as well defined as those for the other areas, and therefore needed more revision.
  • Some of these technologies, such as Collaborate Ultra, are relatively new, and therefore there is little published research and few LearnTech case studies.
  • It is difficult to write guidance that gives enough information on the broad ways the tools can be used without replicating the step-by-step guides on our FAQ platform, ‘AskUs’.

Fortunately, when building the course I was supported by my team and received useful feedback and comments from both Learning Technologists and Learning Designers. I felt it was important to ask for this feedback, as the course should represent the knowledge and experience of the whole team and include their comments as much as possible.

At times it was difficult for members of the team to find the time to do this, so I had to ensure that I had taken the development on to another stage before seeking additional feedback, and be mindful of my colleagues’ workloads.

 

Towards the end of the project it was particularly interesting to work with Professor Ale Armellini and Dr Ming Nie, who challenged my ideas about how staff should approach choosing a tool for ABL.

 

Evaluation – What was good and what was bad about the experience?

The launch of the course has been successful. We currently have 39 users enrolled (13/12/19), and can now provide an online training solution on technology that supports Active Blended Learning that is both scalable and available to staff for whom we have traditionally found it difficult to provide training, such as part-time tutors and those based off campus.

It is now very satisfying to be able to have a discussion about ABL with colleagues in my faculty and send them a link to an online training course for more information.

However, having worked on a unit for self and peer assessment, it was disappointing to have to remove this area.

One personal benefit of creating the course is that I now have a much better understanding of the tools I support and feel better able to discuss the benefits and considerations of each. I think it has also created a better dialogue within the team on the use of these technologies and raised awareness of our case studies.

 

Analysis – What sense can you make of the situation?

 

It is difficult for me to analyse the success of the course, as I was given this as a project some two years ago and its completion is a success in itself. However, if I were to think about this objectively, I might consider whether the time spent producing the course was a good use of my time.

I estimate that I have spent around 200 hours of my time producing this course, with approximately 50 hours of others’ time spent providing feedback.

In terms of analysing the overall benefit, perhaps this is best broken down into hours spent designing against hours spent learning.

For example, we could say that if 50 tutors spend 5 hours each on the course (250 hours), then the course could be equal in value to the time spent producing it.

Following this thought, if we find that 150 tutors only spend an hour each on the course (150 hours), it is in deficit; does that mean it is a failure? The sketch below sets out this simple comparison.
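
To make the comparison concrete, here is a minimal sketch of the arithmetic, using only the estimated figures above; the numbers are illustrative rather than measured data.

  # Rough break-even comparison between time invested in the course and tutor learning time.
  # All figures are the estimates given in this reflection, not measured data.
  production_hours = 200   # my estimated time building the course
  feedback_hours = 50      # estimated colleague time spent reviewing drafts
  total_invested = production_hours + feedback_hours   # 250 hours

  def learning_hours(tutors, hours_each):
      """Total tutor learning time generated by the course."""
      return tutors * hours_each

  # Scenario 1: 50 tutors spending 5 hours each roughly breaks even.
  print(learning_hours(50, 5) - total_invested)    # 0
  # Scenario 2: 150 tutors spending 1 hour each leaves a 100-hour deficit.
  print(learning_hours(150, 1) - total_invested)   # -100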

I suggest that the proposition above is too simplistic to put a value on the course, as there are many associated benefits that weigh in its favour; for example, it has recently been used by a new member of the LearnTech team as part of their induction, and is used similarly by the graduate tutors whom I support in my faculty.

The reports available within the CoursePress plugin will be valuable in providing data on whether tutors complete the course. I can currently see that the most active users on the course are Learning Technology staff and graduate tutors, but new members of my faculty staff have also enrolled upon my recommendation.

As stated earlier, the adoption of online training courses is a new prospect for our University staff, and we do not currently have processes in place to promote them effectively. I am in discussion with the Learning Technology Manager, Rob Farmer, and colleagues about how best to do this.

Conclusion – What else could you have done? 

We are shortly due to tender for both a new VLE and a video hosting solution; as a result, many of the tools covered within the course will not be the ones we are supporting in 2021. It is therefore worth remembering that the training materials contained in this course have a limited lifespan.

With this in mind, as I neared completion of the course I was aware that some areas featured case studies and others did not. Rather than wait for more material to become available, I felt it was more useful to release the course and update the content as new case studies are produced, as this will allow us to get the most value out of the material we have already added.

Given that the course is not currently able to recommend a peer assessment tool, in hindsight it would perhaps have been better not to include this area in the course; however, at the time I saw no indication that the tool would stop working. A better option would have been to extend the testing with more staff.

 

Action Plan – If it arose again what would I do?

As mentioned above, having created an online course, it will be necessary to keep it up to date.

 

Given that peer review could be a very effective method of facilitating Active Blended Learning, I would like to be able to recommend a viable tool for use across the University; however, this is proving difficult. I am currently exploring the peer assessment tools in other VLE systems, such as Blackboard Ultra, D2L and Canvas, as well as looking at third-party alternatives such as PeerGrade.io, which offers a similar workflow to the Blackboard Self and Peer Assessment tool and can integrate with our VLE, Blackboard Learn, via LTI.
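
To illustrate what ‘integrating via LTI’ involves in practice, the sketch below shows the shape of an LTI 1.1 ‘basic launch’ of the kind a VLE performs when handing a user over to an external tool. This is only a hedged, illustrative sketch: the launch URL, consumer key, shared secret and IDs are placeholders rather than PeerGrade’s actual values, and in practice a real integration would simply be configured within Blackboard Learn rather than hand-coded.

  # A minimal sketch of an LTI 1.1 basic launch request, signed with OAuth 1.0a
  # as the LTI 1.1 specification requires. All values below are placeholders.
  from urllib.parse import urlencode
  from oauthlib.oauth1 import Client, SIGNATURE_HMAC, SIGNATURE_TYPE_BODY

  LAUNCH_URL = "https://example-tool.invalid/lti/launch"   # placeholder tool endpoint
  CONSUMER_KEY = "institution-key"                         # placeholder credentials
  SHARED_SECRET = "institution-secret"

  # Standard LTI 1.1 launch parameters describing the user and the course context.
  params = {
      "lti_message_type": "basic-lti-launch-request",
      "lti_version": "LTI-1p0",
      "resource_link_id": "course-123-peer-review",   # hypothetical link id
      "user_id": "staff-0042",
      "roles": "Instructor",
      "context_id": "course-123",
      "context_title": "Enhancements pilot",
  }

  # Sign the form-encoded POST body; the oauth_* signature fields are added to it.
  client = Client(CONSUMER_KEY, client_secret=SHARED_SECRET,
                  signature_method=SIGNATURE_HMAC, signature_type=SIGNATURE_TYPE_BODY)
  uri, headers, body = client.sign(LAUNCH_URL, http_method="POST",
                                   body=urlencode(params),
                                   headers={"Content-Type": "application/x-www-form-urlencoded"})

  # 'body' now holds the launch parameters plus signature, ready to be
  # auto-submitted to the tool as an HTML form POST by the user's browser.
  print(body)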

Our current position in procuring a new VLE platform prevents us from considering licensing additional third-party tools as this would involve an additional cost for the University, and additional funding will not be forthcoming until we have completed our tender. 

However, it is useful to have knowledge of these systems, especially as the free options in PeerGrade may be useful for some staff.

I have also recently attended an online meeting, ‘Design research for peer feedback and assessment in Blackboard Learn’, in which I provided notes to influence the design of Blackboard’s next self and peer review tools. I am hopeful that the next generation of VLE platforms will provide tools which are equal to, if not better than, those currently available in Blackboard Learn.

In practical terms, my next step will come when we next upgrade to the latest version of Blackboard Learn in the summer. If, as anticipated, this resolves the problem with the ‘Self and Peer Assessment’ tool, we will then consider whether to include it in the ‘Enhancements’ course, or whether it would be better to run further testing with staff to check its effectiveness and reliability.

When revisiting the course I would also like to include more case studies, and I have had discussions with staff in my faculty about producing case studies for virtual classrooms (Collaborate) and for using videos for presentations (Kaltura).

 

We have also recently licensed a virtual noticeboard tool, Padlet, for one year. This has already proved very effective for facilitating ABL, and I have been working with my colleague Belinda Green on FAQs to support the tool; I also already have case studies from my faculty. Given that we already have much of the material I would like to add, I hope to add this to the course, and will do so once it is confirmed that the licence will be extended.

Evidence: please see examples of the Enhancements course, emails from staff documenting its construction, and screenshots of the self and peer assessment pilot conducted with Mark Allenby.