How to Add VALUE
Kirsty Davidson & Judy Goldfinch, Napier University
Over the last few years there have been very rapid advances in communication and information technologies, in hardware and supporting infrastructure as well as in software.
The Scottish Universities are fortunate in having broad bandwidth ATM connecting networks which can be used for video conferencing and transferring large amounts of data. The importance of exploiting these Metropolitan Area Networks (MANs) was recognised by SHEFC and various projects were funded under the Use of the MANs Initiative. This paper reports on one of these projects called SUMSMAN (Scottish Universities Maths and Statistics over the Metropolitan Area Networks).
The main aim of the SUMSMAN project is to exploit the potential of the MANs to deliver an integrated programme including multimedia courseware and interactive learning experiences. The problems encountered and achievements made in trying to encourage universities around Scotland to integrate multimedia courseware into their teaching will be reported, along with the views of both staff and students involved in using the courseware. A range of evaluation tools is being trialled, including questionnaires, observation, focus groups and confidence logs. Pre- and post-tests have been used to attempt to assess the effectiveness of the courseware. Details of these methods, as well as preliminary results and observations, will be given.
All Scottish Universities now have studio-based video conferencing facilities. As part of the SUMSMAN project, a pilot scheme of video conference lectures making use of the MANs has been run. Staff and student perceptions of the success of this pilot so far will be examined.
The capability and usage of Communication and Information Technologies (C&IT) have exploded over the last twenty years. Computers have become so much part of our lives that it is difficult to imagine a time BPC (Before PCs!). The technology has advanced faster than our ability to think up ways of using it to its full potential. Not only is the hardware accelerating in power, but the infrastructure within which PCs are used is becoming capable of handling more and more electronic traffic.
Dearing recognised that the exploitation of these Communications and Information Technologies holds out much promise for improving the quality, flexibility and effectiveness of higher education (Dearing, 1997). The Dearing Committee envisage that C&IT will overcome barriers to higher education, providing improved access and increased effectiveness, particularly in terms of lifelong learning. Classes in a particular place at a particular time will be replaced by access to resources that are time and place independent.
With the continuing pressure on costs and an increasing demand for places in institutions, it is important that consideration of quality within education is kept very much to the fore and that technology does not drive the strategy.
Over the last few years, there has been a paradigm shift in education, from teaching to learning. The role of the teacher is changing to become more a facilitator of learning rather than an expositor; students are encouraged to take much more responsibility and control of their own learning in an active way rather than in the traditional passive way.
A call for expressions of interest within the European Union in educational software and multimedia resulted in a high response from enthusiasts convinced of the need for educational multimedia and of a general need for major innovation in the educational system (European Commission 1996). It was recognised that the learning process must become more learner oriented, whereby the learner actively acquires knowledge and skills using not only conventional tools such as books and teachers, but new technology-based tools that enable learning at a distance and lifelong learning.
The numbers in higher and further education have increased dramatically over the last ten years; there is a far higher proportion of young people continuing with education after school and many more mature students either retraining or enhancing their skills. In its Green Paper on lifelong learning, the Government proposes to expand further and higher education in England to provide for an extra 500,000 young people and adults by 2002 (The Stationery Office, 1998). It also proposes to widen access to learning in further, higher and adult education and through the University for Industry, which will take its first students in late 1999. The University for Industry will use leading edge technology to make learning available at work, in learning centres, in the community and at home. The Government also intends to raise quality and standards across teaching and learning after the age of 16, by ensuring implementation of the Dearing Committee proposals and by inspection in further and adult education. The issues covered in the Green Paper are relevant in Scotland also.
Education is seen as being imperative for the development of our societies and economies across the world. There is a general recognition of the importance of the role of technology in helping to maintain and improve standards and quality in education, in the face of increasing numbers and pressures on costs. The Dearing Committee noted, however, that there was as yet little widespread use of computer-based learning materials (Dearing, 1998). This was attributed in part to
- the limited availability of good materials;
- the time and effort required to redesign programmes to integrate computer-based materials;
- the reluctance of academics to use teaching materials created by others.
This paper describes the evaluation of a project which attempts to integrate two computer-based learning packages, a computer-based assessment system and video conferenced lectures into the first year teaching of several different universities. The topic of the software and the lectures is mathematics and statistics but many of the tools and lessons learnt can be generalised to other topics.
Higher education in the UK has one of the most technologically advanced networks in the world. All higher education institutions in the UK are linked into the network by high-speed connections, as are about 90 further education institutions. Electronic traffic over this network has increased 25-fold over the last three years. In addition to these networks, there are even more advanced networks - Metropolitan Area Networks - which permit very high-speed, sophisticated, high-quality network communications. As part of a Government initiative to encourage use of the MANs, Napier University is leading a project called SUMSMAN (Scottish Universities' Mathematics and Statistics across the MANs). This project seeks to find a core curriculum for first year undergraduate teaching in these subjects and to share learning and teaching resources in the form of multimedia software and video conference lectures by means of the MANs. It is also developing further the Mathpool system for storage and retrieval of electronic files relating to mathematics and statistics teaching in Scottish Universities.
In any such project, it is important to report on any problems encountered and lessons learnt, and to establish guidelines for the future. In order to do this, an evaluation strategy, linked to the aims of the project, must be built in from the start.
Four general types of evaluation have been proposed: formative, summative, illuminative and integrative. The aim of formative evaluation is to help improve the design of the CAL; it is carried out on real students while the software is being developed and there are still resources to modify it. Summative evaluation is generally carried out after the software has been produced, to help users choose which piece of CAL to use and for what. Illuminative evaluation is an open-ended method which aims to uncover unexpected important issues in a particular situation; it is a systematic focus on discovering the unexpected. The aim of integrative evaluation is to help users make the most of a given piece of CAL. Often the issue is not whether to use a particular piece of software, but how to make the best use of it. It is a type of formative evaluation, not of the CAL, but of the overall teaching and learning situation; problems identified in the use of the CAL can be responded to by the teacher, e.g. by producing a supplementary handout. In practice there is often overlap between these different evaluation types (Draper 1996).
In order to carry out these evaluations, a variety of techniques can be used, drawing on opinion, memory and observation, and involving experts, teachers who teach the material and actual learners. In general, an expert's opinion is less valuable than that of a teacher trying out the material with students; this in turn is less valuable than the opinion of actual learners (Draper 1995). In the initial stages of evaluation, on-the-spot observation is generally more valuable than questionnaires and interviews relying on memory, as unexpected problems and issues can be detected. Similarly, in any subsequent questionnaire, it is important to include open-ended questions as well as fixed-response questions in order to identify more unusual problems or problems not previously identified. The use of split-screen video with a think-aloud protocol is a useful technique which can yield surprising insights into a user's perception of a system (Crerar and Davidson 1994).
As the aim of this evaluation was to assess the effectiveness of the different educational interventions, and to explore the barriers to the take-up of the technology, the approach taken was to involve all the participants in the learning experience, i.e. the students themselves, the lecturers/tutors and the course co-ordinators. Different evaluation instruments require different amounts of effort on the part of the evaluator and the respondents. Response rates from student questionnaires can be very poor if the questionnaires are not completed during a class; this, however, uses up precious teaching time. The usefulness of the information can also vary. Questionnaires can be limited in the information they provide but are easy and cheap to administer to a large number. Structured open-ended discussion with a group of students can reveal much wider and deeper information, but is much more demanding on resources for larger numbers. In order to take account of these factors, a variety of instruments have been used to evaluate the project as fully as possible.
It was recognised that, in some universities, staff within the Maths and Statistics Departments would not necessarily carry out all the teaching of maths and statistics; some service teaching would be carried out by staff from other Departments. In order to ensure consistency, it was decided to restrict the coverage of the project to teaching by staff from within Maths and Statistics Departments in Scotland.
Early on in the project, an evaluation planning questionnaire was sent to contacts within each university, for distribution to other staff within their department. Members of the SUMSMAN team also made visits and presentations to all the universities to ensure that staff were fully apprised of the project. The purpose of the questionnaire was twofold: to make staff aware of the resources available within SUMSMAN and to quantify the intention to use these resources. Six of the thirteen universities responded saying that they would consider using at least one of the 29 topics in Mathwise or Statwise; this covered 31 courses, mainly from the new universities.
The first step in trying to establish a core curriculum in first year undergraduate teaching in mathematics and statistics was to send a questionnaire to all first and second year module leaders in Maths and Statistics Departments within Scottish Universities asking them to indicate which of a detailed list of topics were included in their teaching.
Video conferenced lectures
Video conference suites are available at all the Scottish Universities. Many of these are relatively new, and no use had been made of them for delivering lectures in maths or statistics prior to the project. Over the first semester of 1997/98, a course in Calculus of Variations, delivered by a member of staff at Napier University, was shared by Napier and Stirling students using video conferencing over a six-week block. A video conference session was also held between Paisley and Napier to introduce and demonstrate one of the Mathwise modules. Feedback from students was obtained using questionnaires and a focus group.
Maths and statistics CBL (Mathwise and Statwise) were made available to all Scottish Universities by FTP over the MANs. In addition, maths assessment software (part of Mathwise) was also made available. Use of FTP over the MANs meant that universities could very easily obtain updates of the software.
Views on the software were sought from students via questionnaires and focus groups, two of which were video conferenced. Confidence logs were used to assess any change in students' confidence as a result of using the software. These were completed after each topic on the software. An expanded version of the confidence log was also used. This version included a short test question which the student attempted prior to using the software for that topic and a similar short test question after using the software. Students were reassured that it was the teaching that was being evaluated, not them.
Questionnaires were used to elicit the views of staff. As well as finding out the reactions of staff who had used the software with students, it was felt important to investigate the reasons why other staff had not; different questionnaires were therefore used for the two groups.
Database of electronic resources
No formal evaluation has yet been done on the use of Mathpool - the database of electronic resources. Questionnaires for users and non-users of the system are planned.
The following paragraphs describe the preliminary results from the evaluation of the project and are based primarily on the first semester of 1997/98. The final report on the project, due in August, will include results covering the second semester as well.
The curriculum survey has been completed and the various Mathwise and Statwise modules have been mapped to corresponding areas in the curriculum, allowing the potential for use of the Mathwise/Statwise modules to be assessed. The most common topics in the curriculum which are covered by Mathwise modules are discrete maths, complex numbers, rules of differentiation and max/min. The most common topics covered by Statwise are probability, normal distribution and regression.
Video conferenced lectures
Staff involved in the video conferenced lectures have not yet been surveyed, but the students involved all completed questionnaires and a focus group was held. Examination results for those students who received a complete set of lectures on a topic were also studied.
Sixteen students attended a one-off video conference introducing one of the CBL and computer assessment packages. Eight students were at Paisley where the author of one of the CBL packages demonstrated the software, and eight were at Heriot-Watt where the assessment package was demonstrated, also by its author. The student comments were predominantly positive, and 87% thought that it was more interesting than an ordinary lecture; however, they also agreed that this was partly because it was a new experience. All but two students said that they would like to have more classes with video conferencing, held in addition to normal classes. Almost half felt that they had learned more than if the same material had been presented in a normal lecture, and felt that the most important advantage of video conferencing was that 'Material can be presented by experts'. The main disadvantage was the small size of the screens.
Twenty students were taught Calculus of Variations by means of a series of six video conferenced lectures. Seven of these were at Napier, where the lecturer was based, and thirteen at Stirling. The Stirling students followed the lectures with on-site tutorials which they found 'very valuable'. After six sessions, students' feelings were evenly split between positive and mildly negative, whether they were at the remote site or the home site. Only 19% of the students now thought that video conferenced lectures were more interesting than a normal lecture, and only 24% agreed that they would like to have more video conferenced classes. This perhaps indicates that after six classes the novelty had worn off. All of those wanting more were from the remote site; the home-site students and the other remote students were undecided. The fact that the home-site students had had to travel to a different (and unpopular) campus to attend the lectures was probably significant here. None of these students felt that they had learnt more from the video conferenced lectures than from an ordinary lecture - many said that they found it harder to concentrate, particularly those on the remote site. All said that they had sufficient opportunity to ask a question, but many said that they felt slightly more inhibited than usual and did not do so. No one on the remote site asked a question. The main advantage that these students saw to video conferencing was that classes can be run even when there are too few students wanting to take that subject at any one university.
Examination results for Calculus of Variations showed that the students had in no way suffered academically as a result of the video conferencing. At Stirling, where students had a completely free choice of question (any 6 out of 8), the two Calculus of Variations questions were joint first and third in popularity; on average (median) marks they came first and sixth. At Napier, where students had to do two Calculus of Variations questions and two other questions, the two Calculus of Variations questions had the highest average marks.
The way that software is integrated into a course can have a marked effect on how well the software is received. It is not so much the success or failure of particular teaching material that should be studied but the whole teaching and learning situation in which the material is used (Draper et al, 1995). The aim should be to identify which factors are most important in a particular teaching situation and consider how teachers can make better use of software by adjusting how it is used. One teaching method in isolation is rarely the optimal way of encouraging learning. Rather a teacher would employ a number of methods or resources such as lectures, tutorials, handouts, homework exercises, etc. If it became apparent that the software did not handle a particular topic very well, a lecturer could adjust the teaching e.g. with an additional handout.
Different universities integrated the software in different ways. Napier, Paisley and Heriot-Watt used Mathwise during a supervised lab along with lectures and tutorials; Napier used Statwise in the same way. Abertay used Statwise instead of a conventional lecture as directed study, but issued a handout including screen dumps and clear directions as to which pages to visit.
Questionnaires were received from 146 students from Napier, 55 students from Paisley and 27 students from Abertay. A video conferenced focus group to discuss the use of Mathwise also took place between Paisley and Napier with five students from each university. An attempt was made to conduct a video conferenced focus group between Napier and Abertay to discuss the use of Statwise, but there was a shortage of volunteers from Abertay. As a consequence, the video conference session went ahead as a focus group of Napier students, with staff from Abertay sharing their experience of using the software with their students.
The main lessons learned so far about how students, from their own experience, think CBL can best be used in a course are as follows. Where opinions appeared to depend on factors such as how they actually used the material, how often they used it, their course of study, gender, age, etc., these factors are mentioned; in general, however, the responses were remarkably consistent. The responses received so far are from 246 students at three universities on a wide range of courses: psychology, engineering, mathematics, science, business and accounting. Some had used the CBL only once, but others had used it more than eleven times; the average was five times. Almost all (98%) had used it in supervised labs, with about a third also using it in their own time. Only 28% were learning the material for the first time; most were practising material they had just been taught. Most (84%) had been tested on the material of the CBL by the time they completed the questionnaires, 61% of these using the computer for their test. 43% had had paper-based materials specially written to go with the CBL package, but most (76%) had not had to do homework associated with the CBL.
69% found that the methods, formulae etc. in the CBL matched those used in their lectures 'Fairly well' with a further 24% finding it matched 'Very well'; 60% thought that the closeness of the match was 'Fairly important' and 22% 'Very important'. 56% found the CBL 'Quite useful' and 17% 'Very useful', the remainder finding it of only 'Marginal use'. Those who thought that the CBL did not match their lectures 'very' closely were more than twice as likely to find it of only marginal use, and those who thought it matched 'very' closely were more than three times as likely to find it 'very' useful. Students who found the pace of their lectures 'Too slow for me' were three times as likely to find the CBL of marginal use; those who found the lectures 'About right' or 'Too fast for me' differed little in their responses.
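Comparisons such as "more than twice as likely" above come from cross-tabulating two questionnaire items. As a purely illustrative sketch (the response labels and counts below are hypothetical, not the survey's actual data), the calculation might look like:

```python
def relative_likelihood(pairs, group_a, group_b, outcome):
    """Compare the rate of `outcome` between respondents in group_a and group_b.

    `pairs` is a list of (group, outcome) tuples, one per respondent.
    Returns how many times more likely group_a is to give `outcome`.
    """
    def rate(group):
        in_group = [o for g, o in pairs if g == group]
        return sum(o == outcome for o in in_group) / len(in_group)
    return rate(group_a) / rate(group_b)

# Hypothetical responses: (closeness of match with lectures, usefulness rating)
pairs = ([("not very close", "marginal")] * 4 + [("not very close", "useful")] * 6
         + [("very close", "marginal")] * 2 + [("very close", "useful")] * 8)

print(relative_likelihood(pairs, "not very close", "very close", "marginal"))
# 2.0 - in this made-up sample, the 'not very close' group is twice as
# likely to rate the CBL of only marginal use
```

The same function, run over the real questionnaire data, would reproduce the ratios quoted in the text.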
Students overwhelmingly (81%) thought that CBL was best used in supervised labs, and that its best uses were for practising material they had just been taught (79%), trying to understand material they were unsure of (56%), and revising before an assessment (53%). Learning material for the first time was thought to be a good use by only 20%. 60% said that an imminent test made them use the CBL more, whether that test was computer or paper-based.
The majority (64%) thought that CBL was best used 'Sometimes alone, sometimes with others'; and 41% would have been willing to pay up to £20 to buy the package for home use.
Students were asked to rank various activities for how effective they had been in helping them learn the subject: on average, 'lectures' and 'tutorials' were ranked roughly equal first, followed by 'supervised CBL', then 'other self study' and finally 'CBL in your own time'. The latter may suffer from the fact that few of them tried this activity and so it had not had a chance to 'help them learn'. In fact, when asked if, ideally, they would like more time per week on each activity, 30% wanted more time on CBL in their own time, whereas only 14% wanted more time spent on lectures and only 24% more time on tutorials; 19% wanted more time spent on supervised labs. Those who had used the statistics package ranked CBL (both supervised and own time) significantly higher than those who had used the Mathwise modules. Rankings differed slightly between universities: Paisley ranked 'other self-study' above either form of CBL, and Abertay ranked supervised CBL above both lectures and tutorials.
Those who had had accompanying paper materials had overwhelmingly (74%) found them 'Very useful', and 60% of those who did not have them picked these out as something that would have greatly enhanced their learning from the CBL. Other things picked out in this way were: a bulletin board of responses to frequently asked questions, faster and more computer facilities, and marked homework sheets with feedback. Interestingly, of those who had been given homework on the CBL material, those who admitted to not doing it found the CBL significantly less useful.
Of the 110 students who had been formally assessed using the computer-based assessment package, 48% felt that they would have done very similarly on a conventional paper test. 23% felt they would have done worse on a paper test (mainly because they thought it was easier to guess on a computer test), and 29% felt that they would have done better on a paper test (the main reasons given being the difficulty and the time it took to enter answers correctly, that the computer cannot give marks for correct methods or partially correct answers, and that the package did not allow them to go back and change an answer after doing another question).
The majority (63%) reported no technical problems, but some (18%) complained that the packages ran very slowly on their computer networks, and a few (6%) complained about having to change campus to access the CBL. Only 2% reported 'crashing' or similar problems.
Finally, the students were asked to state what they saw as the drawbacks and the benefits of CBL. Tying for most common entry under Drawbacks was 'none', together with 'problems entering mathematical expressions in the test'. Other drawbacks mentioned were that it was 'too easy to just flick through' CBL, that it was 'hard luck if you don't like computers', and that it was 'hard to remember what you had done if there were no paper-based material to go along with it'. In the focus groups a student pointed out that "if you don't grasp something a lecturer can explain it a different way or in a different context to help you understand, whereas a computer can't". Students were also worried about some of the randomly picked computer-based assessment questions being easier than others, and that only answers to questions could be marked, not the development of that answer.
The most commonly mentioned Benefits were, in order: 'good for practising exam questions', 'good for revision', 'can go at one's own pace', 'helps understanding', 'can use at any time', 'can go over something again and again', 'good backup for lectures', 'instant feedback on exercises', and 'provides variety'. In the focus groups a student remarked, tellingly, "I don't think there were any bad points. I think that if you had enough time to sit down and work your way through it methodically, you could learn the subjects well, but it is difficult to get the time (on the computers). In an ideal world, this would be like having a tutor in your house." Another remarked, "Going at your own pace freed up the tutor to help the people who really needed help." Yet another amplified a remark made on a few questionnaires that they liked being able to choose for themselves what they looked at, many preferring to go almost directly to the examples and exercises and skip the theory at first.
Focus group students were additionally asked to compare CBL to learning from a textbook: students felt that "CBL is more interesting than a dusty book" and that "animations keep you interested". They also added that "in CBL you have to try the exercises before you get the solution".
Results from the basic version of the confidence logs did not supply a great deal of information; about half of the students felt the same confidence about a particular topic, the other half felt more confident.
Results from the version of the confidence logs where the students tackled a short test question both before and after using the software were inconclusive. Results varied by topic. This was thought to be more a product of the questions set rather than as a consequence of the software. For some topics it became apparent that the questions were too long; students did not allow enough time at the end of the class to complete the question and rushed it.
Results from basic confidence logs:

                                Mathwise                Statwise
                                (206 student-topics)    (369 student-topics)
  less confident now            3%                      0%
  about the same confidence     61%                     47%
  more confident now            31%                     41%
  much more confident now       5%                      12%
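Percentages of this kind can be derived from the raw confidence-log responses with a simple tally. The sketch below is purely illustrative: the ten sample responses are hypothetical, not the project's actual data.

```python
from collections import Counter

# Each confidence log yields one response per student-topic.
LEVELS = ["less confident now", "about the same confidence",
          "more confident now", "much more confident now"]

def summarise(responses):
    """Return the percentage of student-topics at each confidence level."""
    counts = Counter(responses)
    total = len(responses)
    return {level: round(100 * counts[level] / total) for level in LEVELS}

# Hypothetical example: ten student-topic responses
sample = (["about the same confidence"] * 6 +
          ["more confident now"] * 3 +
          ["much more confident now"])

print(summarise(sample))
# {'less confident now': 0, 'about the same confidence': 60,
#  'more confident now': 30, 'much more confident now': 10}
```

Run over the 206 Mathwise and 369 Statwise student-topics, the same tally yields the table above.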
The staff who had used one of the CBL learning packages supplied by the project covered a much wider age range than those who had not used the CBL. The latter had all taught in HE for over 10 years and were all male, whereas the former were evenly spread from under two years to over ten and were evenly split between the sexes.
Of those who used the CBL, only 40% had actually been involved in the decision to use the CBL with their students, and these people were all involved in the SUMSMAN project or in other CBL projects. Only this 40% had ever used CBL before. 60% of those who had not used CBL before said that their opinions of CBL had improved as a result of using it, as did 25% of those who had used CBL before; the opinions of the rest remained the same. Every one of them said that they would continue to use the CBL next year, some with additional classes, and all would encourage others to use it.
73% thought that the CBL matched their teaching 'Very closely', and on average it was judged appropriate for 6.4 weeks of teaching. Those who had used CBL previously seemed to get a more positive response from their students than the first-timers. Generally, staff feelings about the best way to use CBL, the benefits, the drawbacks etc. matched those of the students very closely. The main changes in how they would use it next time were to 'introduce the material first' or to 'introduce the students to the package more positively'. The main perceived benefits to staff were overwhelmingly that it saves time preparing exercises, examples, tests and homework, but also that it provides an alternative learning resource and/or experience.
Of the five staff respondents who had not used the CBL, all had a low or fairly low opinion of CBL for maths or statistics. Four of the five had had a detailed look at one of the CBL modules, but only one knew whether there was a module that matched what they were teaching. 60% were unaware that paper materials existed to go with most of the CBL modules, and 40% were unaware that computer-based assessment was available. 80% were unaware that the statistics CBL was available on CD for students to buy.
Reasons given for deciding not to use the CBL were: lack of time to plan for it, doubt over its suitability of level or approach, shortage of labs or hardware, and that it did not match their teaching structure. There was a fear that it might require additional staff time and training. Traditional university teaching methods were mainly used instead, though one team were using prepared booklets and interactive lectures with their students.
Success of the evaluation methods
The main benefit so far has been the extensive feedback we are gathering from students on various different courses, at a range of universities, all using the software in different ways. Eventually this will allow us to establish guidelines for the best ways to use CBL in mathematics/statistics. The questionnaires have been very successful, yielding a lot of information from complete sets of student users. Despite being quite long (four pages) there is little evidence of lack of care in their completion - almost all students were still willing to write sensible and sometimes extensive comments in the open questions at the end.
Focus groups have, as expected, given fuller information though from only a few people. Being a group ensured more detail in the responses in that a comment from one individual would be taken up and developed by others, and in that the interviewer could ask for clarifications or explanations where necessary. To aid this clarification, it was helpful to have the package to hand on a computer. Taping the interviews is vital if points are not to be missed when the discussion gets going. Using video conferencing between groups using the same software but from different universities added to the interest for the students and allowed discussion of new points that had not occurred to one or other of the groups. It was probably, however, somewhat intimidating at first and the discussion took a while to get going. A demonstration of the package was a useful reminder at the start, after an ice-breaking introduction of the participants to each other. In retrospect, clear name badges on the participants would have been a good idea. Five students at each site proved a workable number: fewer would have been difficult, but up to twice that would have been okay.
For the Napier/Paisley video conference focus group, only about half of those who had 'promised' to turn up actually did so at each site, even given the offered 'reward' of a pint in the union afterwards! At the Napier/Abertay video conference focus group, no students could be persuaded to participate at Abertay. The session was conducted as a student focus group at the Napier end, with staff from Abertay sharing their own and their students' experiences with Napier.
Getting information from staff proved much harder than getting it from students. At the start of the project (summer 1997), every Scottish Maths or Stats department was asked to identify people who were willing to consider using one of the CBL packages, and also those who would be willing to help in the evaluation. Towards the end of the academic year 1997/8, questionnaires were sent to all these staff, and, via the SUMSMAN contact at each university, additionally to two or three staff in each department who had decided not to use the CBL.
Those staff who had used one of the CBL packages with their students were generally happy to fill in questionnaires (often preferring to fill in an e-mail version rather than a paper one); eleven have been received so far, plus two additional staff involved in a group interview. However, staff who had chosen not to use a package were very reluctant to fill in a questionnaire; only five have been received at present. Speaking directly to these people elicited reasons for this such as "I am involved in enough evaluations already" or "The questionnaire made me feel guilty for not using the CBL". Other staff are choosing not to respond yet because they hope to use one of the packages next year and would prefer to respond after that.
Despite the various software resources being made available to all Scottish universities, take-up has so far been low, especially among the older universities. This partly explains the low response from staff.
It is difficult to assess the success of the project at this stage since the evaluation is still ongoing. However, many lessons have been learnt about the relative effectiveness of the evaluation instruments, and these will be refined for future use. The evaluation itself has been formative for the project, and many of the experiences have led to improvements in the way the technology is integrated. These improvements will themselves be evaluated. A number of staff indicated their intention to use the software in the future, and longer-term surveys of usage may indicate increases in the integration of these technologies. Considerable work remains to be done, and questionnaires are coming in every day. Usage of the electronic database has still to be assessed. More split-screen videos and focus groups are planned, together with interviews of key staff at the end of the year. It will also be interesting to discover whether the initiative continues after the disbandment of the original project team.
Encouraging the integration of new technologies is an iterative process. It cannot, and should not, be expected that all staff will adopt them straight away. There is inevitably an investment of time and effort in investigating the potential of any new technologies and in adapting courses to incorporate them. Evaluation studies such as this one will, it is hoped, assist in disseminating lessons learnt about the best ways of integrating them.
Eventually, guidelines based on what the evaluation has discovered will be produced on the best ways of using each of the resources to enhance the learning experience. Hopefully this will overcome some of the barriers identified by Dearing, and lead to wider use of computer-based learning materials. However IT is used in higher education, it must be understood not just as an add-on, but as value-added (D'Andrea, 1997).
References
Crerar, M.A. and Davidson, K.M., (1994), Teaching and Learning through CAL development: an HCI perspective, XXIX Annual Conference of the Association for Educational and Training Technology
D'Andrea, V., (1997), IT and Dearing: The Implications for HE, CTISS Publications
Dearing, (1997), Report of the National Committee of Inquiry into Higher Education, HMSO
Draper, S.W., Brown, M.I., Edgerton, E., Henderson, F.P., McAteer, E., Smith, E.D., Watt, H.D., (1994), Observing and measuring the performance of educational technology, TILT, Glasgow University
Draper, S.W., (1995), After Implementation: Evaluating Courseware, Learning Technology Dissemination Initiative Handbook 7th Edition (now Implementing Learning Technology)
Draper, S.W., (1996), Observing, measuring, or evaluating courseware: A conceptual introduction, Implementing Learning Technology, Learning Technology Dissemination Initiative
European Commission, (1996), Report on the outcome of the Call for expression of interest of the Task Force Educational Software and MultiMedia, DG XIII Telecommunications, Information Market and Exploitation of Research
Government Green Paper, (1998), The Learning Age (Cm 3790), The Stationery Office. Available online at: http://www.lifelonglearning.co.uk/greenpaper/index.htm
HTML by Phil Barker
© All rights reserved LTDI.
Last modified: 29 September 1998.