Published by the Learning and Teaching Unit
Winter 2003
ISSN 1477-1241

Vol 2 Issue 1: Assessment


Gaynor Lea-Greenwood
Senior Lecturer in International Fashion Marketing
Department of Clothing Design and Technology

Developing a new assessment strategy

The assessment strategy of the BSc International Fashion Marketing course was reviewed as part of the revalidation process. This review needed to take account of an increase in student numbers.

I developed three new continuous assessments that were designed to give the students detailed feedback as well as contributing to the final grade. I also decided to finish the year with a multiple choice examination as the final summative assessment.

Some very unscientific research (asking a few of them!) had informed me that first year students were often extremely worried about the level of work expected in higher education, so early, highly guided assessment was one way to help them make the transition. From experience I also knew that first year students need early indications that they are achieving the right level of work, so formative assessments seemed a good idea.

The final multiple choice examination proved more problematic, given that I had only limited experience of this method of assessment and that it was hardly used at all within the Faculty as a summative method. Some colleagues were sceptical of multiple choice, likening it to a current popular television quiz show! However, with the right level of questioning and appropriate distracters this method can test knowledge and understanding of information, and therefore provides a good means of assessing at level one. Further exploration of multiple choice testing would need to be undertaken before going beyond this, but there is the suggestion that higher levels can be assessed with careful thought and planning (see the article by Higgins and Tatham elsewhere in this issue).

In addition to these philosophical issues, a number of practical problems arose:

  • The questions have to be submitted in October, well before the material has been delivered.
  • There did not appear to be a mechanism for the production of answer books.
  • There was no material readily available which matched my lecture schedule, which could be adapted.
  • The question paper was not to be removed from the examination room as it is part of a bank of questions.


Writing the questions themselves

I decided to allocate the number of questions to reflect the themes of the programme: for example, ‘segmentation’ was allocated 15 questions, whereas a topical or general knowledge issue might be given one question. In this way I tried to design the paper to reflect fairly the balance of material delivered during the year.
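The proportional allocation described above could be sketched in code as follows. The theme names and weights here are illustrative assumptions, not the actual breakdown of the paper, and the author worked by hand rather than with a script.

```python
# Illustrative sketch: share a fixed number of questions across themes in
# proportion to the teaching weight of each theme. Theme names and weights
# below are hypothetical.
def allocate_questions(weights, total=100):
    """Allocate `total` questions in proportion to `weights`, using
    largest-remainder rounding so the counts sum exactly to `total`."""
    weight_sum = sum(weights.values())
    exact = {t: total * w / weight_sum for t, w in weights.items()}
    counts = {t: int(x) for t, x in exact.items()}
    # Hand the leftover questions to the themes with the largest remainders.
    leftover = total - sum(counts.values())
    for t in sorted(exact, key=lambda t: exact[t] - counts[t], reverse=True)[:leftover]:
        counts[t] += 1
    return counts

themes = {"segmentation": 15, "branding": 10, "retailing": 8, "topical issues": 1}
print(allocate_questions(themes, total=100))
```

The largest-remainder step simply guarantees that the question counts still add up to the full length of the paper after rounding.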

The distracters proved the most difficult to write, though they were clearly as important as the questions themselves. The literature suggests that distracters should be drawn from actual examples or situations in which students had misunderstood the material, not just made-up wild guesses.

I decided upon 100 questions for no other reason than that the score would convert easily into a percentage; testing on colleagues confirmed that the paper could be answered within one hour. Internal moderation also helped eliminate typographical and language errors, and provided a useful opportunity to share ideas about this methodology with colleagues.


The student experience

The students were briefed and given sample questions and answer books (which had by now been handmade) in class. Advice was given on reading the questions and eliminating distracters, and on answering in pencil with an eraser to hand, then confirming in ink, since any ambiguity in an answer would mean that it was not graded.


The examination itself

  • A number of students arrived without a pencil!
  • One student answered on the question sheet rather than in the answer booklet, but managed to correct this towards the end of the exam.
  • All students appeared to use the full hour allocated (and, under examination regulations, could not leave early anyway).
  • There were some failures, but these were due to students’ lack of attendance at lectures and reflected their overall marks profiles.
  • Passes in the formative assignments mitigated some failures.
  • It was interesting to note, against the spreadsheet for the examination board, that in the majority of cases the multiple choice marks were close to the students’ overall profile.

The marks ranged from 31 to 78, with an arithmetic mean of 60.13, a mode of 61 and a median of 60. The standard deviation was 12.5. I was satisfied with this distribution: the close proximity of mean, median and mode indicated little skew in the results.
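Summary statistics of this kind are easily computed with Python's standard library. The marks list below is hypothetical, for illustration only; the actual cohort data behind the figures above is not reproduced here.

```python
# Illustrative sketch: compute the summary statistics discussed above
# (mean, median, mode, standard deviation) for a list of exam marks.
# The marks used here are hypothetical.
import statistics

def summarise(marks):
    """Return the headline statistics for a list of percentage marks."""
    return {
        "mean": statistics.mean(marks),
        "median": statistics.median(marks),
        "mode": statistics.mode(marks),
        "sd": statistics.stdev(marks),  # sample standard deviation
    }

marks = [31, 45, 52, 58, 60, 61, 61, 65, 70, 78]
print(summarise(marks))
```

When mean, median and mode sit close together, as in the distribution reported above, the marks are roughly symmetrical rather than skewed.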



As the first years return for their second year, I plan to use the multiple choice exam as a revision tool after the summer recess, as many students have asked me for ‘the answers’. I also intend to use this session to open up a debate with students on the use of this method of assessment and to gather their comments on the examination. This may provide useful student feedback for the M & E process.



On balance I think that this was a good exercise and well worth the effort involved in setting it up. I am pleased that from this base I have begun to build up a bank of multiple choice questions and distracters. This bank can now be maintained easily each year by removing dated questions and adding new ones. From the perspective of coping with the assessment of increased student numbers this was a worthwhile exercise, but it was also a good personal development exercise through which I learned something more about the process of assessment. It also had uses beyond the original goal of summative assessment, providing a useful revision tool for returning students.

For the future I would like to build up my bank of questions in electronic format, perhaps using the computer to select the 100 questions at random from a bank of, say, 150. The student answer sheets could be produced in such a way that they would only require scanning to obtain the score. Eventually, students might simply log on to a computer, enter their answers and obtain a score at the end of the exam.
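The electronic bank proposed above could be sketched roughly as follows. The data structures and the tiny three-question bank are assumptions for illustration, not a description of any existing system.

```python
# Illustrative sketch of the proposed electronic question bank: draw a
# random paper from a larger pool, then score submitted answers
# automatically. The bank and answers below are hypothetical.
import random

def draw_paper(bank, n=100, seed=None):
    """Select n distinct questions at random from the bank."""
    rng = random.Random(seed)
    return rng.sample(bank, n)

def score(paper, answers):
    """Count questions whose submitted answer matches the key.
    `answers` maps question id to the student's chosen option."""
    return sum(1 for q in paper if answers.get(q["id"]) == q["correct"])

# Hypothetical three-question bank for illustration.
bank = [
    {"id": 1, "correct": "b"},
    {"id": 2, "correct": "d"},
    {"id": 3, "correct": "a"},
]
answers = {1: "b", 2: "c", 3: "a"}
print(score(draw_paper(bank, n=3, seed=42), answers))  # → 2
```

Dividing the score by the number of questions and multiplying by 100 would then give the percentage mark directly, mirroring the choice of a 100-question paper.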


Gaynor Lea-Greenwood
Department of Clothing Design and Technology
0161 247 2650


February 2003
