The Role of Virtual Learning Environments in the Online Delivery of Staff Development

Report 1: Review of Experiences of Delivering TALiSMAN Online Courses

Colin Milligan
Institute for Computer Based Learning, Heriot-Watt University, Riccarton, Edinburgh EH14 4AS
November 1998

email: colin@icbl.hw.ac.uk
URL: http://www.icbl.hw.ac.uk/jtap-573



3  COURSE REVIEW

The original online course has now been delivered on five different occasions to a total of 284 participants. A slightly greater number of people have signed up for the TALiSMAN Online Study Centre (333 in October 1998). Formal feedback on the TALiSMAN Online Course has been collected through electronic feedback forms. For the TALiSMAN Online Study Centre, only informal feedback has been collected - no formal evaluation has yet been carried out by TALiSMAN.

3.1  Online Course Feedback
Formal feedback for the online course attempted to elicit information about the delivery and content of the course, as well as about participants' profiles, views, previous experience and expectations. We used essentially the same feedback forms for the initial course and the subject-focus courses, but rather different forms to evaluate the second course; this was necessary because we wished to collect feedback throughout that course rather than only at the end. Completion of the evaluation forms was optional and we achieved a low rate of return, never rising above one third of participants. As a result, it would be unwise to place too much emphasis on the feedback collected: it may well be that those who filled in the feedback form were those who were happiest with the course. Nevertheless, we can draw some specific views and general pointers from the responses and comments collected.

Feedback from individual courses was used to revise subsequent courses. It was partly responsible for the introduction of an academic subject focus, input from external tutors and a change in use of the discussion forum. Feedback also led us to change the way we advertised the course and to clarify its aims.

Feedback was mainly positive, with most respondents indicating that the course met their expectations, that the level of tutor support was appropriate, and that they would be happy to recommend the course to others. The table below summarises the responses to these questions [Table 2].

                                                           Yes   No   Don't Know
Did the course meet your expectations?                      29   10            8
Would you recommend the course to others?                   37    2            7
Did you find the level of tutor support adequate?           29    3           12
Did you find the course useful in relation to your work?    39    6            1
Table 2: Quantitative Feedback

Respondents were also asked directly about a number of features of the course, its most and least useful aspects, whether they would make any changes, and for any other general comments. The responses highlighted a number of issues, discussed below.

3.2  Expectations for the Online Course
Although the online course was advertised as introductory level, concerned with exploring the potential of the WWW as an aid to teaching, a number of participants reported quite different expectations. Many participants (especially in the initial delivery) enrolled not to learn, but to experience participation in an online course. Although this was a valid reason for enrolling, these participants were already largely familiar with the content of the course, and their presence changed the behaviour of the group as a whole: they were more active in discussion groups and perhaps discussed different issues.

Some participants expected practical training in the creation of web pages. Whilst we felt that this would have constituted a logical part of the course, it was thought impractical (in such a short course) to attempt to pass on a skill which people learn in very different ways. As other TALiSMAN courses covered web authoring, it was felt acceptable to omit it here.

Finally, a number of participants expected to be led to teaching materials tailored for their own courses. As the course was generic, we felt it more appropriate to pass on transferable skills for finding and evaluating learning materials and resources, leaving academics to use their own subject expertise to find specific resources. The subject specific courses delivered in the second year of the project were designed to address this issue. Course participants were directed to CTI centres and Information Gateways, many of which provide libraries of links to materials they have identified as useful and appropriate. Unfortunately, such resources are not available in all subject areas.

3.3  Participation
By far the largest subject for feedback was participation. Many participants felt that 2-3 hours a week was not enough to complete the course materials fully. The course materials were largely self-contained, but each week a number of examples referred to external sites. This format encourages the learner to explore those web sites, and it is likely that the participants who spent more than three hours each week were those who were most easily side-tracked and explored the external sites more fully than was required. When designing web based materials that refer to external web sites, it is important to structure the materials to counter this, especially if the materials are objective-led. Some web page designers will go as far as having no external links in their site, whilst others will keep external links in a separate section at the end of the material. A possible solution might be to link to external web sites, but to open all external links in a separate browser window, thus preventing participants from becoming completely detached from the original material.
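As an illustration, the fragment below sketches how such a page might be marked up: external references are gathered into a short 'further reading' list at the end of the lesson, and each link carries a named target so that it opens in a single secondary browser window. The addresses and headings are purely illustrative and are not taken from the actual course materials.

    <!-- Further reading: external links kept together at the end of the lesson -->
    <h3>Further reading (opens in a separate window)</h3>
    <ul>
      <li><a href="http://www.example.ac.uk/teaching/resource1.html"
             target="external">An illustrative external resource</a></li>
      <li><a href="http://www.example.ac.uk/teaching/resource2.html"
             target="external">Another illustrative external resource</a></li>
    </ul>
    <!-- Because both links share the target name "external", they re-use one
         secondary window rather than opening a new window for each link, and
         the lesson page itself remains open in the original window. -->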

Many participants found that it was impossible to set aside 2-3 hours without being interrupted (e.g. by students or phone calls). Others reported that they had worked through the materials in their own time (staying late after work) to avoid interruption.

In planning the course, we were acutely aware that participants would find it difficult to devote sufficient time to it. We felt it appropriate to structure the delivery of the course (one lesson a week) rather than simply to present the materials for self study. One advantage of this approach is that the regular delivery of chunks of material can act as a guide through the course content. However, as our courses were not accredited, lecturers were never released from other tasks and there was no carrot we could offer (or stick we could wield) to encourage or force participants to complete the course on time. Inevitably, a busy lecturer faced with the choice of spending three hours on the online course or clearing a backlog of marking will choose the latter, and our participants often emailed apologies for non-participation with reasons such as 'I've just been given extra teaching...' or 'It's exam time and I've got lots of marking...'. A number of our participants asked whether the course materials would be available after the end of the course, or suggested that the courses themselves could, in future, be run over a longer period.

The main casualty of this inevitable constraint on participation time was the discussion forum. Although we tried to stress the benefits of discussion, we felt it was never used to its full potential, with a generally low level of use in all courses (though discussions in the subject specific courses were slightly more active). Some respondents felt that participants were shy and would have benefited from a face to face session at the start of the course; this was tried for some of the courses but seemed to have little effect on the usage of the discussion forum.

The choice of discussion forum can significantly affect the way in which online discussion occurs. Although we felt that the Dialogue discussion forum was appropriate for the level of course and number of users, we were concerned that its rudimentary structuring was of little use for reading discussions retrospectively. As we knew that participants might only look in on the discussion area once a week, we felt that such structuring was an important feature, and before the subject specific courses were delivered we decided to change to a more robust system. We chose WebBoard, which provided a simple and familiar interface, was easy to integrate with the existing web pages and was simple to administer. A few users remarked that the WebBoard interface was confusing, but this in itself did not seem to affect the level of use; realistically, six weeks is too short a time to expect students to become familiar and comfortable with any interface. One criticism of WebBoard was that, by being more rigidly structured and clearly presented, it may have stifled discussion because participants felt that their submissions would come under greater scrutiny.

An alternative possibility might have been to use email as the main discussion medium. Email is familiar to virtually everyone in the community, so no familiarisation would have been required, and as most participants check their email every day they would have been able to keep in touch easily with the discussion as it progressed. In practice, however, this model would have been unsatisfactory. Most participants tried to set aside time to work through the course in a single session per week, rather than in several short sessions. Whilst individual email messages might be passed over and left unread, a single visit to WebBoard allowed a participant to catch up on many discussions at once. Also, not every participant would be interested in every discussion thread, and some might come to resent the arrival of emails they considered irrelevant. We did use email to announce and pace each week of the course, but did not use it to deliver 'content'. If the level of discussion in the forum had been higher, it might have been appropriate to send an email digest of discussions to each participant once or twice a week.

3.4  Technical Considerations
The course was designed to be accessible by anyone with a graphical web browser. Although we felt we had been careful to avoid 'technology for the sake of it', feedback indicated that our participants were intolerant of any activity which they felt was a waste of their time. For instance, one part of the material discussed the use of plug-ins and how they can benefit teaching, and linked to a site which used the Chime plug-in. Although it was made clear that installing and using the plug-in was not necessary, several participants did so, then complained that they had spent a long time on an activity that had very little merit. Although we did not anticipate many problems with access, we did ask each participant for the platform and browser version they were using. As a consequence we were able to identify which users might have difficulty with some of the materials and act accordingly.
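The sketch below shows one way this information could be requested on a registration page. It is illustrative only: the field names and the script address it submits to are assumptions, not details of the form actually used on the course.

    <!-- Registration form fragment asking for platform and browser version -->
    <form method="post" action="/cgi-bin/register.cgi">  <!-- hypothetical script address -->
      <p>Platform:
        <select name="platform">
          <option>Windows</option>
          <option>Macintosh</option>
          <option>Unix</option>
        </select>
      </p>
      <p>Browser and version (e.g. Netscape 4.5):
        <input type="text" name="browser" size="30">
      </p>
      <p><input type="submit" value="Send details"></p>
    </form>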

3.5  Online Study Centre Feedback
Feedback from the Online Study Centre has been collected informally, through discussions with users as well as through the discussion forum. This feedback is concerned more with the usability of the system than with the content, or with whether the Centre succeeded in its aim of providing an effective and supportive environment for the delivery of self-paced learning materials. There are incidental comments on the learning materials, but no real appraisal. Comments have been fed back to improve the usability of the course materials within the Centre, and over the first eleven months of the Online Study Centre this feedback has resulted in many improvements to course structure, making the materials more varied and active.

As the discussion forum is not used for tutor led discussions, the level of use tends to be low. Participants in the Study Centre realise that they are engaged in self-paced learning and tend to work independently. Although many messages accumulate in the system, users are given the option to display only new messages. There tend to be periods of activity as a topic is discussed, then a lull before the next point is raised. The main use of the bulletin board tends to be for technical queries, but it is also used to swap web addresses, to allow users to air their own interests and to ask for general advice. As these queries are answered, they become part of the growing resource of the Study Centre, and new users come across answers to commonly asked questions in the bulletin board. This seems to work quite well; if it were not satisfactory, it would be an easy task to abstract the material into a FAQ (Frequently Asked Questions) page. The Online Study Centre has now been running for almost one year and will soon be internally evaluated as part of TALiSMAN's ongoing analysis of its courses. The results of this evaluation will be made available when complete.

3.6  External Evaluation
As part of the SHEFC-commissioned external evaluation of the TALiSMAN project [13], users of the Online Study Centre were approached independently and asked for feedback. Appendix D of that report provides a case study report from one user and a questionnaire completed by another. The evaluators do point out that those replying to the questionnaire are likely to be those with a more positive attitude to the courses.

The questionnaire responses highlighted many positive aspects of the Centre. This user had worked the whole way through the courses, re-used the materials, and contributed to the discussion forums and resource database. They felt that the level of tutor support was appropriate and that the materials were pitched at an appropriate level, and were happy to recommend the Centre to others (indeed, they had already done so). The user was critical of some of the course structuring and suggested more interactivity, which was felt to be lacking.

The simple structure adopted for the TALiSMAN online course attracted no criticism. In contrast, the Online Study Centre environment created within WebCT has been criticised by many users, who dislike the navigation structure and the way some of the tools function. With the extra functionality on offer, the structure of the Online Study Centre materials is necessarily more complex and perhaps requires a period of familiarisation or induction.

As a further part of the external evaluation, a 'typical' user was commissioned to work through all courses in the Online Study Centre and produce a case report. This user had already participated in one of the staged delivery TALiSMAN online courses (and hinted that devoting sufficient time had been difficult). The commissioned evaluator was also generally positive about the Online Study Centre (and would recommend it to others) but outlined a number of criticisms.

Navigation around the courses was felt to be poor: 'Home' buttons on different pages had different destinations, which was confusing to the first-time user. This failing is inherent in the structure of WebCT, but we had felt that, as the materials were self-paced, familiarisation with the course structure could be gradual rather than requiring a formal induction. A similar criticism was levelled at the personal homepages: although potentially useful, few participants created one, probably because the tools provided for this purpose by WebCT are not particularly good. Making a homepage was entirely voluntary for users of the Study Centre; had we wanted to make it compulsory, we would have investigated and provided more appropriate tools.

The final negative comment regarded a technical issue. The 'Web Images' course required software which was not available for the platform used by this evaluator, and there was some degree of frustration at the amount of time wasted before this was realised. This highlights the need for some formal technical support. Although we give some guidance about the recommended specification of machines used for viewing the Online Study Centre, and do our best to ensure that the learning materials can be fully utilised by as many participants as possible, there can be unanticipated difficulties. Again, with experience, this could be rectified.

On a more positive note, the commissioned evaluator liked all the 'active' elements of the courses, such as the discussion forum (which was used to solve some problems), the instructions to 'TRY IT' in the 'Simple JavaScript' course, the resource database and the quizzes. Interestingly, our evaluator was acutely aware of the differing styles of the courses (an argument for standard structures and styles) and was much more positive about courses that had been created specifically for the Online Study Centre than about those which had initially been delivered face to face. Indeed, it was recognised that courses originally delivered to a group would be better studied online as a group, with staged delivery and discussions at specific times.

In each of the cases described above, interactivity (or at least activity) emerges as a key issue. Finding a balance between increasing the interactive components and maintaining a focus for the course is vital. It is sobering to note how intolerant our users were of materials not specifically designed for online delivery.


