|Selecting a Methodology|
As with any process, one of the most important steps in carrying out a successful evaluation is choosing the right way to go about doing it. If the study's design is well suited to the questions being considered, the whole process will be made considerably easier.
The surest way to get the design right is through experience. If you have already carried out evaluations, so much the better - you will already be aware of many of the problems and issues that will affect the work, and will be familiar with some of the approaches that can be used to counteract these. However, whether or not you have this experience, there will be other people who do. It is always worth talking to other evaluators when preparing a study, as their perspective will help to identify any potential difficulties in your plans. Further familiarity can be gained through reading about other people's studies and approaches. The recipe pages in this book provide an invaluable starting point for this process.
However, it is possible to make sensible choices without needing to become an expert in the topic first. There are a number of questions that can help to choose which methodology is best suited to the topic of your study. These can be grouped into questions about the methodology itself, about the techniques it uses for gathering data, and about how these data are analysed. It should be noted that this approach necessarily relies on generalisations. Whilst they provide a good starting point for evaluation design, practice and experience will obviously enable you to make more informed decisions.
Choosing a methodology
With a clear question in mind, it is possible to start working out which methodology you need. A good starting point is to decide how exploratory your study needs to be. In the example above, the 'what' question is highly exploratory - the evaluator has little or no idea about the factors that will influence learning. These need to be discovered in the course of the study. In the 'which' question, factors have already been found. What remains is to test them to demonstrate their influence. For this reason, the study needs to be much less exploratory. Open, qualitative methodologies such as interviews, observations and concept maps tend to be best suited to exploratory studies, whilst checklists and experiments require a framework for questions to be fixed in advance.
A second important question to ask is how authentic your study needs to be. When designing instructional material, it may be more appropriate (and more ethical) to test your ideas in a laboratory-like setting, rather than on students whose exam grades may be affected. However, such controlled setups are unsuitable for evaluating how to improve the way that these materials are used as part of the curriculum. Such studies require a more authentic setting. Clearly, controlled experiments are far less authentic than (for example) ethnographic studies or student profiling. Some techniques, however, can be used in both types of setting - observations are a good example of this.
Finally, it is important to be aware that the number of people who will be involved in the study will have an impact on the approach you choose. It would be impractical to carry out open interviews with 200 students, and probably inappropriate to use a comparative experimental design on a group of eight participants. Broadly speaking, the methodologies that are best suited to large groups will limit the amount of qualitative data to be gathered.
Choosing data capture techniques
One characteristic of evaluation methodologies is the types and range of data that are collected. As with methodologies, the process of choosing the right data capture techniques can be made easier by considering a series of questions. Perhaps the simplest to ask is how objective the gathered data will be. If subjective information, such as attitudes and perceptions, is of interest, then questionnaires and interviews are appropriate. If you need to know how students act or interact, or how their performance is affected by some change in the curriculum, data such as those contained in video logs or test results will be important. It is worth emphasising that the subjectivity of data has no impact on the rigour with which they will be analysed - it is simply a description of the type of data under consideration.
Another important quality is how focused the data will be. One reason that ethnographic studies use observations is that peripheral data, such as the weather, social interactions and so on, can all be taken into account if they seem to influence proceedings. A multiple-choice questionnaire, on the other hand, gives no latitude in the information that is gathered. The tight focus will make the data easier to analyse, but the downside of this simplicity is that the data are also limited.
There is also the practical concern of how long data gathering will take. Participant observation is extremely time-intensive, as are interviews. Video recordings and questionnaires simply require setting up and gathering in, making far fewer demands on the evaluator's time.
Finally, it is worth being aware that access to resources can restrict the range of data capture techniques that can be used. The availability of screen capture software, video cameras, microphones and even an adequate photocopying budget will all need to be taken into account.
Analysing and presenting data
Data can be presented in a range of formats, each of which will be appropriate for a different purpose or audience. These formats will be restricted by the type of data gathered and the methods of analysis. Qualitative data are extremely good for presenting illustrative or personalised information. They are less useful for providing summaries or overviews, however, and unlike analytical statistics, they make it hard to specify how confident you are that the findings have been caused by a particular factor, or that they will generalise.
Martin Oliver & Grainne Conole
© All rights reserved LTDI and content authors.
Last modified: 26 March 1999.