1. Plan
The purpose of the evaluation determines what data need to be logged and how they should be recorded and summarised. The evaluation plan provides the specification and the level of detail for establishing the logging process. This may be 'high level', e.g. number of logins and time on task per student, or 'low level', e.g. keystrokes, mouse clicks, navigation choices and frequency of errors. Logging is done by a program running on the server from which the courseware is accessed.
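As a hedged sketch of what such a server-side logging program might record, the snippet below appends timestamped event records to a file. The file name, record layout and event names are assumptions for illustration, not a standard; the same record format can serve both high-level summaries (logins, time on task) and low-level ones (navigation choices, errors).

```python
import csv
import time

LOG_FILE = "courseware_events.csv"  # hypothetical log file name

def log_event(student_id, event, detail=""):
    """Append one event record: timestamp, student, event type, detail."""
    with open(LOG_FILE, "a", newline="") as f:
        csv.writer(f).writerow([time.time(), student_id, event, detail])

# Example events at both levels of detail:
log_event("s001", "login")
log_event("s001", "navigate", "module2/page3")
log_event("s001", "error", "quiz1: wrong answer")
log_event("s001", "logout")
```

Appending one record per event, rather than keeping running totals, preserves the raw detail so that the level of summarisation can be decided later, in line with the plan.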
2. Data Collection
Raw data from the logging process are imported into a statistical package such as SPSS, StatView or Excel, then sorted into categories defined according to the purpose of the evaluation, e.g. individual student, group or functional profiles. Collecting too much data rather than too little, and keeping the output format flexible, are good principles, as it is difficult to anticipate significant but unexpected results; it is usually easier to deal with some redundant data than to modify the logging program during the evaluation. Statistical tests can be applied where appropriate, and emerging patterns may feed back into an iterative courseware design process, be used for triangulation, or become the focus of further evaluation.
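The sorting of raw records into per-student profiles can be sketched as below. This assumes a record layout of timestamp, student, event type and detail (an illustration, not a prescribed format); the resulting counts of logins and errors and the time-on-task estimate correspond to the 'individual student profile' category mentioned above.

```python
from collections import defaultdict

def student_profiles(rows):
    """Summarise raw event records into per-student profiles:
    login count, error count, and time on task (logout minus login)."""
    profiles = defaultdict(lambda: {"logins": 0, "errors": 0, "time_on_task": 0.0})
    sessions = {}  # student_id -> timestamp of the currently open session
    for ts, student, event, detail in rows:
        ts = float(ts)
        p = profiles[student]
        if event == "login":
            p["logins"] += 1
            sessions[student] = ts
        elif event == "error":
            p["errors"] += 1
        elif event == "logout" and student in sessions:
            p["time_on_task"] += ts - sessions.pop(student)
    return dict(profiles)

# Invented sample records (timestamps in seconds):
rows = [
    (0,   "s001", "login",  ""),
    (120, "s001", "error",  "quiz1"),
    (600, "s001", "logout", ""),
    (0,   "s002", "login",  ""),
    (300, "s002", "logout", ""),
]
print(student_profiles(rows))
```

A table of such profiles, one row per student, is in a form that imports directly into SPSS or Excel for further analysis.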
In the final analysis, conclusions are drawn from, or supported by, the system log data. Statistics are available for reporting purposes, and graphs, charts and other summaries are easily produced. The data sets may also serve as a baseline for comparison following modifications to the courseware or its supporting environment. Graphs produced from log data show trends in a form that is easy to read and interpret, and statistical packages make producing them, and applying various tests, a simple task.
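The baseline comparison described above can be sketched with simple descriptive statistics. The figures below are invented for illustration only; in practice a statistical package would supply the appropriate significance test alongside these summaries.

```python
import statistics

# Hypothetical time-on-task figures (minutes) before and after a
# courseware modification -- invented data for the sketch.
before = [32, 41, 28, 55, 39, 47]
after  = [25, 30, 22, 41, 33, 29]

for label, data in [("before", before), ("after", after)]:
    print(f"{label}: mean={statistics.mean(data):.1f}, "
          f"sd={statistics.stdev(data):.1f}")
```

Comparing means and spreads between the two data sets gives a first indication of whether the modification changed student behaviour, before any formal test is applied.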
Limitations of system log data analysis