LEARNING TECHNOLOGY DISSEMINATION INITIATIVE 
Evaluation Studies 

A Diet of Carrots: Autonomy in Learning Mathematics for Economics
Robbie Mochrie, Heriot-Watt University

Abstract
The teaching of mathematics to economics students is a core component of all degree courses. Many students report acquisition of the necessary mathematical skills and their application to economic theory to be one of the most difficult and unsatisfying parts of their courses. Teaching this material is made particularly difficult by the variation in students' prior knowledge, confidence and motivation to study this material.
At Heriot-Watt University, we have started to develop computer-based learning systems which we hope will eventually provide students with sufficient incentives to learn mathematical skills more effectively. In the first phase, which ran this year, we used the University's WebTest engine to write a mathematics test each week. All the test questions contained multiple randomised parameters, so that successive realisations were almost certain to differ. This allowed us to give students credit for successfully completing the tests, while allowing them as many attempts as necessary to pass.
Student response to the tests has generally been favourable. WebTests appear to have been successful in encouraging students to keep working at developing their mathematical skills, and this is reflected in examination marks. Problems have arisen because of difficulties in inputting answers, reflecting the need for staff as well as student learning, and because of the inadequacy of feedback. The second phase of the project, which will involve the conversion of the present module lecture notes into more accessible hypertext documents, is intended to allow much more complete linkages between tests and other elements of teaching, and to increase students' autonomy in learning.
Introduction
Before commencing a discussion of my use of computer-aided learning in economics, I should like to tell an anecdote that I consider to be quite instructive. A friend, who is a technician in a medical laboratory in one of the larger teaching hospitals in this country, tells me that much of the time the medical staff are unable to diagnose the cause of patients' ailments. So he is sent various samples of tissue and the like to determine whether there is anything abnormal. Treatment in such cases then reduces to alleviation of symptoms and signs, and so the primary concern of the medical staff is to "stabilise" the patient's condition so that these signs are "normal". My friend's suggestion is that very often medical staff are more concerned with ensuring normality than with patients' survival, so that a perfectly acceptable outcome appears to be death, but with normal indicators.
I find myself wondering to what extent such problems are to be found not only in medicine, but also in the appraisal and evaluation of teaching innovations. What I have just described is familiar in the literature on student learning, where such behaviour would be considered to reflect shallow and atomistic learning. In general, it is considered appropriate for teachers (certainly in higher education) to encourage deep, holistic learning, involving the evolution of students' conceptions and the integration of knowledge within a coherent framework. Yet here we are at this conference, pulling out a single one of the various learning activities in which we expect our students to engage, and seeking to evaluate its effectiveness in isolation. I do find myself wondering whether, in doing this, we are like medical staff facing a condition that is too complicated for them to understand fully, responding to individual effects rather than analysing the system as a whole from the very start. In particular, I wish to suggest that there is an inevitable tension between short-run success in teaching and long-run success. In short, it is not enough for us to say that an innovation has been successful because students have completed assessments satisfactorily, or even that they indicate approval of the innovation in surveys. These are means of eliciting surface information. It may be necessary to embark on action research programmes in order to identify the extent to which learning has actually taken place.
I hope that you will bear with this rather pessimistic outlook, which is the result of my being in the middle of developing an innovation. As I hope to show this afternoon, the innovation that I have developed along with Andy Crofts of the Teaching and Learning Technology Service at Heriot-Watt University has been successful in many respects. I am quite certain that more students in the economics class are learning more mathematics as a result of their exposure to this innovation. Yet I am also concerned that the form of the innovation has led to students concentrating excessively upon problem solving, and that this has limited their ability to transfer the skills that they develop in the environment of the computer-based learning to more general economic analysis.
Having started out by stating that it is very important for us to consider the objectives of teaching in order to evaluate its effectiveness and also that it is important to consider the effects of all teaching, it seems sensible to work from the most general aspects to rather more specific ones. I suggest that all non-vocational higher education seeks to help an individual to locate a personal reality in which there is scope for conflicting opinions, but which nonetheless rests upon a set of coherent beliefs. To be slightly less obscure, learning in higher education is associated with the development of critical faculties and the ability to respond to arguments by considering the premises on which they are established and the evidence supporting them. The concepts contained in those last statements are very important and certainly controversial, Western philosophers having sought to clarify them since the time of Socrates. I therefore do not intend to attempt to unpack them further today, but instead turn my attention to the particular discipline of economics.
It seems reasonable to say that most academic economists have developed a particular way of thinking about their discipline. While disagreements among economists are sufficiently famous to have been recorded in witticisms such as "If all the economists were laid end to end, they wouldn't reach a conclusion," nonetheless such disagreements tend to relate to controversial areas at the frontiers of the discipline. In the areas where undergraduates are studying, there seems likely to be much greater consensus, certainly concerning the ways in which questions might be approached. The starting point of most economics is the assumption of rational self-interest, an instrumental concept of rationality in which people seek to achieve their own ends, however those might be defined. In addition, economists ordinarily assume that demands are insatiable and that markets work in allocating scarce resources among these demands. The discipline that develops from the interconnection of these core beliefs is based on the study of equilibrium within elegant models for which precise mathematical representations exist and numerical approximations abound.
There is a problem here. Students who do not possess the ability to understand the algebraic representation of simple models and the analysis that leads to the derivation of the equilibrium for that model will find it very difficult to follow the arguments presented at the higher level of a degree course. Equally, students' introduction to economics does not prepare them fully for this treatment of the subject. Let's consider just how quickly this change takes place. On the left we see a schematic presentation of one of the fundamental building blocks of economic analysis, the problem of how a consumer chooses between two goods in order to maximise utility, as it would be shown to first-level students. Central to the presentation is the diagram, and surrounding the diagram is solid, dense text. For the student to understand the material, the student has to be able to understand what the various curves and lines represent and interpret the relationships between them. Now I do not want to denigrate such an approach, for as recently as fifty years ago, most students probably would not have needed to undertake any more sophisticated analysis at any stage of their studies, and there are undoubtedly many sophisticated results that can be obtained from this form of analysis.
But now let us look at the figure on the right, which is a schematic representation again, this time of the first chapter of one of the more advanced second-level microeconomics textbooks. Now I am cheating somewhat here, for on the preceding page there is indeed a similar, if briefer, analysis based around the diagrammatic representation. But I think that it is instructive nonetheless that the authors of this book assume from its very start that it is appropriate to present a problem of constrained optimisation, assuming understanding of the Lagrangean multiplier technique for solving this problem as well as understanding of the concept of the total differential, matters that would not be familiar to the majority of students whose study of mathematics had ended at A-level. Such a presentation is problematic. Imagine for a moment that you are a second-year undergraduate who considered passing GCSE mathematics to be something of an achievement and escape from mathematics at A-level to have been a great relief. Suddenly you are presented with arguments such as this. My feeling is that your perplexity would lead to anxiety or even to panic. Your response would either be to try to learn the equations symbol by symbol, as though mere reproduction would be a good enough imitation of learning to pass an examination; or else you would pass on as quickly as possible to the point where the mathematical form of argument concluded. Although the latter is much more likely, the former certainly occurs. Equally, suppose that you are a student with a strong mathematical bent, but who has not previously met the Lagrangean multiplier condition. Here there is another danger, that you work through the mathematics listlessly or independently. Noting the effort that your teacher makes to drill others in the class in the technical aspects of the argument, you do not develop the economic logic that underlies the technical argument. Being honest, this was my experience.
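To see concretely what such a textbook opening assumes, here is a minimal sketch of the constrained optimisation just described, for a consumer with utility U(x, y), prices p_x and p_y and income m. The notation is illustrative, not taken from any particular textbook:

```latex
% Consumer's problem: maximise utility subject to the budget constraint
\max_{x,\,y}\; U(x,y) \quad \text{subject to} \quad p_x x + p_y y = m

% Form the Lagrangean
\mathcal{L}(x,y,\lambda) = U(x,y) + \lambda\,(m - p_x x - p_y y)

% First-order conditions
\frac{\partial \mathcal{L}}{\partial x} = U_x - \lambda p_x = 0, \qquad
\frac{\partial \mathcal{L}}{\partial y} = U_y - \lambda p_y = 0

% Dividing the two conditions eliminates \lambda
\frac{U_x}{U_y} = \frac{p_x}{p_y}
```

The final line is precisely the tangency shown in the first-level diagram (the slope of the indifference curve equalling the slope of the budget line), but the student must now recognise it through the algebra rather than see it on the page.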
The first time that I really had to confront this material and understand how all its components locked together was when I came to write lecture notes upon it. This raises some interesting questions: in particular, if it is possible for a capable student to pass through a first degree with a severely impaired understanding of such key concepts, how do weaker students fare?
These comments provide an indication of the difficulties faced by an economics lecturer attempting to teach mathematics. I have described above paths that are likely to lead to surface learning, rather than deep learning, so that students grasp facts and do not perceive concepts. In addition, however, we must consider students' orientation. I suggest that for most students, the only source of motivation to study this material is academic: the social, the vocational and the personal are likely to be absent, even though econometricians (economists who use statistical analysis) are probably one of the best paid groups of professionals in the country. More importantly, however, I suspect that most students see mathematics as an external imposition. It is one of the hoops through which they must jump in order to complete the requirements of their course, not something that they wish to master chiefly in order to understand it, or even because they see it as being an essential building block in their understanding of economics. Even worse, there are some students for whom mathematics is simply something that is happening "out there", who avoid contact with it as far as possible, whose prejudice against their own understanding is so great that their orientation towards the material never really develops.
Implementation
What we are trying to do at Heriot-Watt is to provide students with incentives to develop their orientation towards the principles of mathematics. This is still work in progress, and our thinking has developed as students have interacted with the new material and we have had the opportunity to reflect on their performance. We started by identifying the problem as being one of students not developing sufficient familiarity with essential techniques early on, so that, confronted with economic applications of these techniques from which important principles of economics could be deduced, they were unable to respond effectively. We thought it likely that the students were having difficulty in seeing the wood for the trees. For example, a student who is expending considerable effort on obtaining the reduced form of a system of equations is probably going to be sufficiently exhausted by that effort as to be unable to perceive the economic interpretation of the reduced form parameters. We concluded that students would find many ways of avoiding mastery of these techniques unless they were in some way compelled to confront this material at an appropriate time.
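To make the reduced-form example concrete, consider the simplest linear market model (a hypothetical illustration, not one of our actual test questions):

```latex
% Structural form: linear demand and supply
Q = a - bP \quad (\text{demand}), \qquad Q = -c + dP \quad (\text{supply})

% Eliminating Q yields the reduced form
a - bP = -c + dP
\;\Longrightarrow\;
P^{*} = \frac{a + c}{b + d}, \qquad
Q^{*} = \frac{ad - bc}{b + d}
```

The economic content lies in the reduced-form parameters: a rise in the autonomous demand a, for instance, raises both equilibrium price and quantity. It is exactly this interpretive step that a student exhausted by the elimination is likely to miss.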
Economists tend to be fully paid-up behaviourists. It therefore seemed perfectly reasonable to us that students would only do what we wanted them to do if we provided them with adequate incentives in the assessment scheme. We wanted to assess students regularly, but without making the assessment procedure threatening. Continual examinations, quite apart from requiring resources that would not be available, would therefore be impractical. Setting coursework assessments completed without supervision permits collusion and direct copying that is very difficult to detect. However, it did seem possible to use the WebTest engine developed by Maureen Foster in the Department of Mathematics at Heriot-Watt University. Relying on the programming skills of Andy Crofts, the development officer in TLTS, we set up a sequence of fourteen tests, which the students completed over the first two terms of this academic year, all of them introducing students to the application of mathematics to economics.
What we particularly liked about the WebTest engine was its versatility. Students needed only to be able to access a web site in order to generate a test. Because questions contained multiple random parameters, there were generally thousands of possible variants of each of the questions that might be asked, so each test generated was almost certain to be different from all of the others. Generally answers were numeric, so that students had to perform a calculation, usually one involving two or three steps, and then type the appropriate numbers into the boxes on the form. However, it was also possible to use multiple choice questions, and the variety of formulae that we could use increased as the engine was developed over the period that we used it. Students completed the tests by typing in the text boxes on the form. Once a test had been completed, it could be submitted for marking immediately, and the score was returned to the students on a form that gave the correct answers together with hints concerning the way to find each solution.
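The mechanism can be sketched in a few lines. The following Python fragment is my own illustration of the randomised-parameter idea, not the WebTest engine's actual code; the particular question, the function names and the marking tolerance are all invented for the example.

```python
import random

def generate_question(seed=None):
    """Generate one randomised equilibrium-price question, in the
    spirit of a WebTest item (illustrative only).

    Demand is Q = a - bP and supply is Q = -c + dP; the student
    must find the equilibrium price P* = (a + c) / (b + d).
    """
    rng = random.Random(seed)
    a = rng.randint(50, 200)   # demand intercept
    b = rng.randint(2, 10)     # demand slope
    c = rng.randint(10, 60)    # supply intercept
    d = rng.randint(2, 10)     # supply slope
    text = (f"Demand is Q = {a} - {b}P and supply is Q = -{c} + {d}P. "
            f"Find the equilibrium price, to two decimal places.")
    answer = round((a + c) / (b + d), 2)
    return text, answer

def mark_answer(submitted, correct, tolerance=0.01):
    """Numeric answers are accepted within a small tolerance."""
    return abs(submitted - correct) <= tolerance
```

Because the parameters are drawn independently, each generated test is almost certain to differ from every other one, which is what allowed us to give credit for passing while permitting unlimited attempts.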
Each test required the students to answer ten questions. In order to encourage students, these questions were of differing difficulty, and it was generally possible to answer the first two or three with only the most basic knowledge. However, in order to encourage students to keep working at the material, we used a binary marking scheme: to pass a test, a student had to answer seven of the ten questions correctly. Students who passed received a credit of 1.5% of the mark for the module to which the test was attached, so that a student could earn the credit without answering every question correctly. The pass mark of seven out of ten was nonetheless intended to ensure that students earned their marks, demonstrating a good understanding of the material covered in the process. To encourage students to take risks, they were allowed as many attempts as they required to pass a test. Most needed no more than three unsuccessful attempts, with marks tending to rise quite quickly to the pass mark. Over half of the students completed all of the tests in the second term, and the credit this earned ensured that they entered the final examinations able to perform quite badly, yet still pass. As we shall see, this was just as well.
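The binary marking scheme itself reduces to very little code. A sketch follows; the function names are again my own, while the figures are the ones described above (a pass mark of seven out of ten, and 1.5% of the module mark per test over fourteen tests):

```python
PASS_MARK = 7           # questions correct, out of ten, needed to pass
CREDIT_PER_TEST = 1.5   # percentage of the module mark per test passed

def webtest_credit(questions_correct):
    """Binary marking: full credit for a pass, nothing otherwise."""
    return CREDIT_PER_TEST if questions_correct >= PASS_MARK else 0.0

def module_credit(best_scores):
    """Total credit over the sequence of fourteen tests, given each
    student's best score on each test; at most 14 * 1.5 = 21%."""
    return sum(webtest_credit(s) for s in best_scores)
```

A student who passed all fourteen tests therefore entered the examination with 21% of the module mark already secured, which is what made it possible to perform quite badly in the examination and still pass.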
At first, we set ten separate questions, but as time went on, we were able to ask questions that had multiple parts and which examined a particular question of economics in a number of ways. Throughout, the questions sought to apply mathematical techniques to economic analysis in an attempt to encourage students to look beyond the manipulations and to reflect upon what they were doing.
Andy and I worked at very high speed in order to prepare the number of tests required. This led to a number of technical glitches and, on one occasion, to my placing on the Web a test which had a major error in one of the answers. However, this was noticed almost immediately by some of our more diligent students and rectified very quickly. We were surprised by the infrequency with which comments were directed to us concerning the tests, and certainly Andy Crofts had found other classes much more demanding in seeking guidance and assurance about the use of the tests. To some extent, we attributed this to the tests being used right from the very start of the academic year, so that students simply saw this as being the way in which they were to study mathematics. Other projects that had used WebTests had only introduced them as pilots during the middle of modules, so that students might already have developed approaches to learning the material that were not consistent with the methods expected by the developer of the WebTest.
Evaluation
The impression that we formed during the running of the tests was that students were broadly satisfied with them. When we carried out a survey of the students early in the second term, we found a more mixed response than we had expected. In a class where the majority of students had not studied mathematics beyond GCSE, it should have come as little surprise to us that relatively few of the students were regular computer users. The WebTests were seen as being very flexible to use, although only a small number of students obtained access from outside the university. Twenty-eight of the forty students surveyed had completed all or most of the tests, which suggests that the sample was biased slightly towards the better students. Most students claimed to be spending three hours or so on each test, which was about the level expected, over two or three sessions. This was also pleasing, since it suggested that students were having to work quite hard in order to complete the tests, but were managing this in what seemed to be a reasonable time. There was considerable evidence of group working, which is commendable, although quite a number of students did specifically mention that, having worked with others to work out how to answer the questions, they then completed the tests on their own, which is probably the ideal. The preferred mode of working therefore appears to be to generate a test, submit it blank to obtain the answers and hints, print these off and take them away to work on. To some extent, this may have led to attempts at backward induction, with students who knew the answer to a particular question working back from the answer to the method of solution.
We found that there was concern over the lack of feedback with the answers. All that students learned from submitting a wrong answer to a question was that they had answered it wrongly. They did not receive any hints concerning the ways in which it was wrong and, with many of them determined to continue working at questions until they solved them, there was some feeling that they were working too long at the tests. There was also some feeling that questions were badly worded, and concern about the "mistakes" in the tests. This puzzled us, since most of the questions had been carefully designed and tested by us before being made available to the students, and certainly contained no errors so far as we could see. It turned out that the problem was to do with the formulation of the questions and the ways in which answers had to be submitted, which appeared to be insufficiently flexible. Insofar as was possible, we sought to remedy these matters in the remaining weeks of the course, and later exercises suggest that students found the relatively simple changes that were made quite helpful in using the tests.
By the end of the course, most students agreed that WebTests had helped them to work more consistently, considered computer-aided assessments to be a useful way of exploring a subject, and felt that the WebTests had helped them in applying mathematics to economics, but were still concerned by the limited feedback. They considered that the most effective technique was to print off a test and then to work as part of a group in order to obtain the answers, and reckoned that it was important to keep up with the tests and other work in the course, rather than trying to complete them in a rush at the end of term, but thought that it would have been much easier to have a maths A-level in the first place. So, all seemed to be going well. The tests had been completed, and most students had obtained very satisfactory credits for completing the material. This suggested that whereas students had in the past failed to continue working at the material throughout the two terms, they had now been persuaded, possibly by the lure of easy marks, to do so. It was therefore very disappointing for us to mark the examinations at the end of the second term and to discover that the majority of students had done very poorly in the mathematics questions. In all too many cases, there was little evidence of any learning at all, and there appeared to be very little difference between the results this year and in previous years. We have explored some of the reasons for this with Prof. John Cowan, formerly of the Open University in Scotland. He has suggested that there is a methodological flaw in the work as it stands, and is concerned that there is too much emphasis upon numerical testing and problem solving. In other words, we give students marks during the year for learning and applying one set of skills, but these are only a subset of the ones that we wish them to acquire.
Professor Cowan's suggestion is that we should link the WebTests more explicitly to the rest of the course, and that we should encourage students to abstract from what they are learning in the WebTests, so that they are able to identify concepts that they can use elsewhere. The proposal that he has made is that we should allow students to prepare cards on which they would write down guides to techniques of problem solving, allowing them to take these into examinations. In other words, we would recognise the limitations of the WebTests. Excellent though they are for encouraging students to keep on working away at particular kinds of problem, they are probably insufficiently flexible by themselves to encourage the reflection necessary for deep learning. As a tool, they have already proven themselves, increasing students' independence and in many ways also their effectiveness as learners. As we integrate other material with them, particularly a course hypertext, they will no doubt become even more valuable resources. Indeed, the lesson to take from this experience may be that in order to make such a substantive innovation, it is not possible simply to bolt it on to the course, as we did with the WebTests; it is necessary to consider much more carefully how it relates to other components of the course. Given that the students are likely to do most of their learning in the context of these tests, other aspects of the course, such as lectures, tutorials and even examinations, should perhaps be redesigned with the use of WebTest in mind.
The URL of the web site that was used for setting up the test is http://flexlearn.ma.hw.ac.uk
To contact the maintainers, mail ltdi@icbl.hw.ac.uk
HTML by Phil Barker
© All rights reserved LTDI.
Last modified: 29 September 1998.