LTDI: Learning Technology Dissemination Initiative

The Luton Experience



by Stan Zakrzewski


Since 1993 the University of Luton has been committed to the development of a university-wide computerised assessment strategy, using "Question Mark Designer". In 1996/97 approximately 1,000 students sat formative assessments and a further 9,000 sat summative assessments.

Initially 150 psychology students sat interactive formative assessments, but the system has expanded annually as more subjects and courses have elected to move towards computerised assessments. Subjects involved during this year include midwifery, biology and law for formative assessments, and marketing, politics, maths, sports science, psychology, computing, biology, materials science, economics and communications for summative assessments.

The university has seven computer areas with a total of 200 workstations, linked by a Novell network. Where a module size exceeds 200 students and two sittings of the examination are required, the sittings are timed to be immediately consecutive, and invigilators are used to ensure that the two groups are kept apart. As with other assessments, timetabling is controlled by the exams office.
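For illustration only, the sitting arithmetic described above can be written out as a short calculation. The figure of 200 workstations comes from the text; the function and variable names below are invented for this sketch and are not part of the Luton system.

    import math

    # Illustrative sketch only: how many immediately consecutive sittings a
    # module would need, given the 200 workstations mentioned above.
    WORKSTATIONS_AVAILABLE = 200

    def sittings_required(module_size, capacity=WORKSTATIONS_AVAILABLE):
        """Number of consecutive sittings needed for one module."""
        return math.ceil(module_size / capacity)

    # e.g. 150 students -> 1 sitting; 230 students -> 2 sittings
    print(sittings_required(150), sittings_required(230))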

Academic staff are encouraged to come to the central support unit, ULTRA, to discuss their requirements for any assessment well in advance. Arrangements are made for them to see another examination taking place, and they are put in touch with other staff who are involved in computerised assessments. The Question Mark package offers a range of different question styles and types. Issues of question design are discussed with interested staff, and a "Question Mark" users group meets regularly to discuss good practice and exchange ideas. A typical paper contains perhaps 60-80 questions for a one-hour examination. In addition, about 10 sample questions are made available on the network 3-4 weeks before the examination so that students can familiarise themselves with the technology. Once the questions have been designed, academic staff do not need to use the software themselves, because the questions are entered by administrators.
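As a rough illustration of this question-design workflow, and without assuming anything about Question Mark's own authoring formats, a toy representation of a question bank and the small familiarisation sample might look like the following sketch; the class and function names are hypothetical.

    import random
    from dataclasses import dataclass
    from typing import List

    # Toy model only: Question Mark Designer has its own authoring formats,
    # which this sketch does not attempt to reproduce.

    @dataclass
    class Question:
        stem: str
        options: List[str]
        correct_index: int

    def pick_sample_questions(bank, n=10):
        """Choose about 10 questions to publish on the network 3-4 weeks
        before the examination, for familiarisation with the technology."""
        return random.sample(bank, min(n, len(bank)))

    # A one-hour paper would typically draw on a bank of 60-80 such questions.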

Paper copies of the exam are checked by staff and external examiners, then both a paper copy and a disc copy are lodged in the exams office. On the day of the examination the invigilator collects hard copies of the exam. Learning resources staff are responsible for collecting the electronic copy, and for loading it onto the appropriate server(s). Before students are allowed to enter the examination area a final test of the system is made to ensure that all files are operating correctly. On completion of the examination answer files and results are collated and sent to the exams office.
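The file handling on the day can be pictured with a short, hedged sketch: the directory layout and file extensions below are invented for illustration, and the real system works with Question Mark's own files on the Novell servers.

    from pathlib import Path

    # Invented layout: one exam file per server before the exam,
    # one answer file per student afterwards.

    def exam_loaded_everywhere(server_dirs, exam_name):
        """Final check before students enter: the exam must be on every server."""
        return all((d / (exam_name + ".qm")).exists() for d in server_dirs)

    def collate_answer_files(server_dirs, exam_name, out_dir):
        """After the exam: gather answer files from each server for the exams office."""
        out_dir.mkdir(parents=True, exist_ok=True)
        collected = 0
        for d in server_dirs:
            for answer_file in d.glob(exam_name + "_*.ans"):
                (out_dir / answer_file.name).write_bytes(answer_file.read_bytes())
                collected += 1
        return collected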

A number of emergency procedures have also been specified in case of any problems occurring. Having such procedures clearly defined in advance is both helpful and reassuring for the invigilators and students who might be involved. The examination is not loaded onto the network until immediately prior to the exam, so security is not considered to be a problem. All networks are tested after loading the exam, but before the students enter the hall. A couple of spare machines are always available in the examination hall in case an individual machine fails. If failure occurs within 15 minutes of the start of the exam, the student is moved to a free machine and begins the exam again. If the failure occurs more than 15 minutes in, restarting would risk timetable clashes, so the student is asked to complete the remaining part of the examination on paper. If an entire network fails, the examination would be rescheduled.
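These contingency rules amount to a simple decision procedure. The sketch below restates them in code form; only the 15-minute threshold comes from the text, and the function itself is purely illustrative.

    def contingency_action(minutes_elapsed, whole_network_down=False):
        """Restatement of the emergency procedures described above."""
        if whole_network_down:
            return "reschedule the examination"
        if minutes_elapsed <= 15:
            return "move the student to a spare machine and restart the exam"
        return "complete the remaining part of the examination on paper"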

Objective testing can include a wide range of question styles and, for some courses, may be a more suitable form of assessment than more traditional methods. The system also provides immediate feedback on performance, which is much appreciated by students, and staff can use the same feedback for prompt diagnosis of any important areas of difficulty. Automatic marking, the elimination of second marking and comprehensive statistical analysis of results free staff to pursue other areas of interest.
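To make the marking and analysis step concrete, here is a minimal sketch of automatic marking plus one common item statistic (the proportion of students answering each question correctly); the data structures are hypothetical and are not how Question Mark stores results.

    from statistics import mean

    def mark(responses, key):
        """Score one student's objective-test responses against the answer key."""
        return sum(1 for given, correct in zip(responses, key) if given == correct)

    def item_facility(all_responses, key):
        """Proportion of students answering each question correctly."""
        n = len(all_responses)
        return [sum(1 for r in all_responses if r[i] == correct) / n
                for i, correct in enumerate(key)]

    key = [1, 2, 0, 3]
    cohort = [[1, 2, 0, 3], [1, 0, 0, 3], [2, 2, 0, 1]]
    print([mark(r, key) for r in cohort])          # individual scores
    print(mean(mark(r, key) for r in cohort))      # mean score
    print(item_facility(cohort, key))              # per-question facility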

After four years of experience, the key points that have emerged are:

Dr. Stan Zakrzewski
The University of Luton
Unit for Learning Technology, Research and Assessment
Park Square
Luton LU1 3JU
Tel: 01582 743294
E-Mail : ultra@luton.ac.uk



First published: April 1997.
First web version: 26 September 1997.
Last modified: 07 December 1999 (formatting).