Manchester Metropolitan University
Published by the Learning and Teaching Unit
Winter 2003
ISSN 1477-1241
Learning and Teaching in Action

Vol 2 Issue 1: Assessment


Rachel Forsyth

The Concept of Plagiarism
Bill Johnston

Plagiarism Detection Software - a new JISC service
Rachel Forsyth

Can students assess students effectively? Some insights into peer-assessment
A. Mark Langan and C. Philip Wheater

Exploring the potential of Multiple-Choice Questions in Assessment
Edwina Higgins and Laura Tatham

Developing a new assessment strategy
Gaynor Lea-Greenwood

Assessing the Un-assessable
Tim Dunbar

How to assess disabled students without breaking the law
Mike Wray

Returning Feedback to Students via Email Using Electronic Feedback 9
Phil Denton

Tools for Computer-Aided Assessment
Alan Fielding and Enid Bingham

Faculty Learning and Teaching Reports

Learning and Teaching News from the Library


Alan Fielding and Enid Bingham
Biological Sciences

Tools for Computer-Aided Assessment


Computer Aided Assessment (CAA) refers to the use of computers to assess students. Although this definition could include optical mark readers, this review is restricted to assessments completed by the student at a computer without the intervention of an academic. Nor is the review concerned with the design or relevance of CAA as an assessment tool; interested readers can find many (too many!) relevant resources on the web (see Box 1 for some suggested starting points). Assuming that you have decided to incorporate some CAA into your assessment programme, what CAA resources are available? Four packages of varying degrees of sophistication are described (Box 2 has contact details). There are many others, such as Hot Potatoes, which may be better or worse than the four described in this review. Hot Potatoes, for example, is a particularly nice, free (to education) tool for the generation of formative CAA.

Box 1. Some suggested web resources that address general aspects of CAA.
Title Web Address
CAA Tools, Resources and Articles
CAA workshop
CAA centre
Designing and Managing Multiple Choice Questions
Good question design for objective testing
Advice for students taking MCQ tests


Box 2. Contact details for the software covered by this review
Product url Notes
WebCT The MMU implementation can be accessed at
Question Tools  
PracticeMill UK distributor and download from (Kovach Computing)


Biographical notes and caveats

Both authors have more than 25 years' teaching experience at all levels from foundation to postgraduate. In the mid 1980s both of us took advantage of career breaks to study for an MSc in Biological Computation at the University of York. Our MScs gave us strong IT skills that we have subsequently used in both teaching and administrative roles. Although forgetfulness comes with increasing age, we find it difficult to recognise the constraints that less IT-capable staff would experience when using these CAA tools. Inevitably there is a trade-off between product complexity and the skills needed to exploit a product's full potential.

It is always difficult to make fair and comprehensive comparisons between different pieces of software, particularly when you begin with considerable experience of, and opinions about, one or more of the products under test. One of us (AF) has five years' experience with WebCT (Biological data processing for second year undergraduate students and three MSc courses) while the other (EB) has spent over 12 months developing real assessments (PgC Biomedical Sciences course) using PracticeMill. Unfortunately it was not possible to test Question Tools and Questionmark under real teaching conditions. This means that we may have missed some of their important advantages and disadvantages, since the nuances and annoyances of software design only come to light when the software is used with real problems. However, our experience did enable us to identify features that we considered important. For example, question portability is essential: few of us have the time to spend hours retyping questions into a different format.

The products

There is an important distinction between assessments that immediately transmit student responses or performance to a central resource and those that are completed on a stand alone, non-networked, computer. Because it is simpler to ‘police’ and manage assessments on networked computers we would normally restrict assessments using stand alone computers to formative and self-diagnostic measures. These products all have some potential to run assessments from secure servers. However, this protection comes at a cost, which may, in addition to extra purchase or licence costs, include significant server management time and skills.

WebCT (the MMU implementation is listed in Box 2) is a large VLE managed for academics by staff in the Learning & Teaching Unit, particularly Rachel Forsyth and Robert Ready. Part of the WebCT functionality is the CAA, or quiz, tool. You have a great deal of control over the delivery settings for a WebCT CAA. For example, you can:

  1. allow students between 1 and an unlimited number of attempts;
  2. restrict access to specific time windows;
  3. restrict access by a variety of criteria including user id, previous performance and IP address;
  4. protect the assessment by a password, for example a password only given out at one time in a particular room;
  5. determine the amount of feedback.

There are five question formats but only four can be used for ‘pure’ CAA. ‘Paragraph’ questions allow students to enter free text, but this must be marked by an academic. Although images can be included in questions they are not ‘active’, unlike the options available in some Questionmark and Question Tools Editor questions. The remaining four question types are:

  1. Multiple Choice: students select one or more correct answers to a question. Each answer can have its own mark (positive or negative).
  2. Matching questions ask students to match two lists of items into pairs. WebCT arranges answers randomly when the question is presented to the students.
  3. Calculated: students answer a mathematical question. You specify a calculation as an equation with one or more variables. After minimum and maximum values have been specified for each variable WebCT will generate a set of randomly selected values so that students see different values for their individual question.
  4. Short Answer: students answer by entering the correct short answer as a single word or short phrase. You must supply a list of possible correct answers, and whether the answer(s) are to be case sensitive. However, unless answers are unambiguous there is scope for student unrest! For example, if the answer to a question is the ‘Atlantic Ocean’, will you allow ‘the Atlantic’, ‘the atlantic ocean’, ‘atlantic’, etc?
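
The calculated question type is essentially templating plus random number generation. A minimal sketch of the idea in Python (our own illustration; the function and parameter names are not WebCT's):

```python
import random

def make_calculated_question(template, variables, answer_fn):
    """Instantiate a calculated question: pick a random value for each
    variable within its (min, max) range, so each student sees different
    numbers in the same underlying question."""
    values = {name: round(random.uniform(lo, hi), 2)
              for name, (lo, hi) in variables.items()}
    question_text = template.format(**values)
    return question_text, answer_fn(**values)

# A hypothetical biology example:
question, answer = make_calculated_question(
    "A culture contains {n} cells and doubles once. How many cells result?",
    {"n": (100.0, 1000.0)},
    lambda n: 2 * n,
)
```

Marking then compares the student's response against the answer computed from their particular values.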

One criticism of WebCT is that questions are generally written using the WebCT interface, which requires an internet connection and can respond slowly if the network and/or server are busy. An alternative is to use the stand-alone Windows program Respondus to design questions, which can then be uploaded to the WebCT server. Respondus is not free but may be worth considering since it is easy to use and can export questions in a range of formats (at least in the just-released version 2).

Questionmark Perception is widely used in industry and in many universities. It is available in two formats: Perception for Windows and Perception for Web. As the names suggest, Perception for Windows is used to create, deliver and report on assessments using a Windows PC, while Perception for Web, "our most popular package", is used to administer assessments over the Internet or an intranet. Perception can create questions in a number of different formats and place them into topics and sub-topics. Topics can be assigned "Topic Outcomes" so that topic-based feedback can be provided based on the score achieved for that specific topic. The available question types depend on the authoring tool used: the Windows version has more question types than the browser-based authoring tool (Box 3).

Box 3 Question types in Questionmark’s Perception software

Perception for Windows authoring only

  • Drag-and-Drop: the participant clicks and drags up to ten graphics into position. Feedback and score depend on the final positions of the images.
  • Fill-in-the-blank: a statement is presented in which one or more words are missing. The score can be determined by checking each blank against a list of acceptable words.
  • Hotspot: the participant answers by clicking on a picture. A graphics editor is provided to simplify specifying the choice areas.
  • Java: an interface to Java allows programmers to build customised items and have the results recorded in the answer database.
  • Macromedia Flash: a similar interface allows programmers to build customised items using Flash and have the results recorded in the answer database.
  • Matching: two series of statements/words are presented and the participant must match items from one list to items within the other list.
  • Matrix: several multiple choice questions are presented together and the participant selects one choice for each statement or question presented.
  • Pull Down List (selection question): a series of statements is presented and the participant matches each statement with an entry from a pull-down list.
  • Ranking (Rank in Order): a list of choices must be ranked numerically with duplicate matches not allowed.
  • Select-a-blank: a statement is presented with a missing word; the participant selects the missing word from a pull-down list.

Both authoring tools

Essay question: up to 30,000 characters of text can be entered (not computer-marked).

Likert scale: the participant selects one choice from a scale such as "strongly agree" through "strongly disagree"; the choices are weighted with numbers to aid analysis of the results.

Multiple choice: select one choice from up to 40 possible answers. There is no limit to the length of each answer.

Multiple response: similar to multiple choice except the participant is not limited to choosing one response.

Numeric questions: a numeric value must be entered; this may receive one score for an exact answer and a different score if the response merely falls within a specified range.

True/False: the participant selects "true" or "false" in response to the question.

Yes/No: the participant selects "Yes" or "No" in response to the question.

Word response (text match): the participant types in a single word, or a few words, as their response. You define right or wrong words or phrases in advance by entering a list of acceptable answers. The grading logic can also score on the presence or absence of keywords or key phrases and check for misspellings.
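
Marking a text-match response of this kind reduces to comparing a normalised answer against the list of acceptable strings, which is exactly the ambiguity flagged for WebCT's short answer type above. A sketch of the idea (our own Python, not code from any of the reviewed products):

```python
def mark_text_answer(response, acceptable, case_sensitive=False):
    """Mark a typed answer against a list of acceptable strings,
    optionally ignoring case."""
    answer = response.strip()
    candidates = list(acceptable)
    if not case_sensitive:
        answer = answer.lower()
        candidates = [c.lower() for c in candidates]
    return answer in candidates

# The 'Atlantic Ocean' ambiguity: every variant you intend to accept
# must be listed, or students will (rightly) complain.
acceptable = ["Atlantic Ocean", "the Atlantic", "Atlantic"]
result = mark_text_answer("the atlantic ocean", acceptable)  # marked wrong: not listed
```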


Perception’s web-based tools are controlled from the Enterprise Manager home page. The three main areas of this comprehensive package are summarised in Figure 1.

Figure 1. The Questionmark Perception Enterprise Manager structure

The Questionmark applications are sophisticated, professional tools for the design, implementation and management of CAA. However, the pricing models suggest that these are unlikely to be departmental purchases; indeed, their purchase could probably only be justified at Faculty or institutional level. Since WebCT is already available universally across MMU, it is difficult to imagine a scenario under which the additional expenditure and server management needs could be justified unless the WebCT licence was terminated. Although the WebCT assessment tools are more limited than those in the Questionmark products, the extra VLE functionality of WebCT probably gives it the advantage at present.

Question Tools has five modules, two of which are free (but must be registered). The publishers state that they have never charged for these free products and never will. The modules are:

SimpleSet: a simple, free question editor that can create tests delivered as web pages or as ‘exams’.
Exam: a free, secure alternative to web delivery for tests.
Editor: the professional version of the question editor, with a wider range of options and a graphical interface for positioning question elements. A free version of the Editor can be used to create small tests.
Results Analyser: produces summaries of test performances.
Server: a delivery and management tool for tests delivered across an intranet or the internet.


In addition Question Tools offers a test ‘hosting’ service which allows users to upload tests on to their server for test delivery. All of the server-delivered options provide greater security and problem-recovery facilities. The question types are more restricted than the other products but include the common types.

Question type Notes
Select/Multiple choice: students select the correct answer from a list of 2–4 choices. In SimpleSet only one correct answer is allowed; multiple correct answers with variable marks are allowed with Question Tools Editor.
True/False: students rate statements as true or false. 1–4 buttons are allowed per question.
Answer Fields: short text answers are evaluated. Multiple correct answers are allowed, and 1–4 text fields are allowed per question.
Additionally, using Question Tools Editor:
Drag Question: the student drags answers into the correct positions.


Feedback, sound and graphics are only available with questions created using Question Tools Editor.

There are four options for the delivery of tests: web pages, Question Tools Exam, the Question Tools local server, or the Question Tools host server. The web pages are simple to create and do not require any special scripts, so they should work from most servers, but they only work with Internet Explorer. Question Tools Exam is a small, freely distributable program. It is more secure than the browser-based tests because it is harder to close, and it uses a secure encryption method. Question Tools Exam can also collect group results over a local area network, which can then be analysed with Question Tools Results Analyser. The Exam software also includes support for users with visual and hearing impairments.
Despite the two free pieces of software, a complete installation at departmental or institutional level is quite expensive. For example, an installation with 100 Editor and Results Analyser licences plus a 12 month server licence would be almost £10,000.


Practice Mill is a simple and relatively cheap program for creating MCQ tests; indeed, the £185 price for a multi-user site licence is on a par with some single-user licences. It consists of a Question Editor that is used to create tests that can be delivered as web pages, as ‘exams’, or via the proprietary PMViewer. PMViewer is a small, stand-alone program that can be freely distributed. It presents the tests in the form of a ‘stack of cards’ and can be run from a PC hard drive, a floppy disc or the student network. The amount and type of feedback can be set, and the editor can also be used to collect student performance data and analyse responses to individual questions. A User Accounts Manager allows user lists and passwords to be created to control access to the appropriate tests.

It is very easy to add and edit questions in the Practice Mill Editor: after ten minutes' instruction, an administrative assistant was confident enough to enter six tests, each containing 20 questions, in a very short time. There are limitations, of course, some quite severe. One is that only two types of question are allowed, short answer and single-response multiple choice, and these cannot be combined in the same test. One way of overcoming the single-answer constraint is to restructure the question (see Box 4); this has the additional advantage of making questions more challenging for postgraduate students. A further serious limitation for scientists and mathematicians is that superscripts, subscripts and symbols are not available, although images (e.g. gif files) can be used to work around this to some extent. The length of text lines, especially in the answer choices, is limited if the test is used with the Viewer.

We have a group of part-time PgC students who have to complete six MCQ tests as part of their assessment. They seemed an ideal group with which to try out Practice Mill and to replace the paper-based tests with CAA. Although it is possible to create web-based tests, there are problems. The directory that contains the tests must have both read and write access if student performances are to be recorded; this was too insecure to risk on our departmental server. It is also possible to set up a web-based test to e-mail results on completion, and we had a trial run at this, but it was difficult to extract information from the resulting e-mails, and with around 200 expected in total this did not seem a practical option either. Instead we used individual practice discs presenting the tests through the Viewer. As the students take the tests, log files keep a record, and the students, who have very varied attendance patterns, either e-mail their log files or bring in their discs from time to time to have their marks recorded. This is not the most efficient process, but it is not too time-consuming with a group of this size (35 students).
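
Collating the emailed or hand-delivered log files can be partly automated. The sketch below assumes a hypothetical one-line-per-attempt log format (name,test,score); the real PMViewer log layout may well differ:

```python
import csv
import pathlib

def collect_marks(log_dir, out_csv):
    """Merge per-student log files into one mark sheet.
    Assumes each line of a .log file reads: name,test,score
    (a hypothetical format for illustration only)."""
    rows = []
    for log in sorted(pathlib.Path(log_dir).glob("*.log")):
        for line in log.read_text().splitlines():
            name, test, score = [field.strip() for field in line.split(",")]
            rows.append({"student": name, "test": test, "score": int(score)})
    with open(out_csv, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=["student", "test", "score"])
        writer.writeheader()
        writer.writerows(rows)
    return rows
```

With something like this, recording marks becomes a matter of dropping the incoming log files into one directory and regenerating the sheet.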

Box 4 A multi-answer MCQ reformatted as a single answer question.

1. Normal flora can

  1. cause bacteremia
  2. spread to sterile parts of the body if the environment changes
  3. cause disease in immune-suppressed patients
  4. include pathogenic bacteria
  5. prevent infection on some occasions

The answer is:-

  1. 1,2 + 3
  2. 2, 4 + 5
  3. 1,3,4 + 5
  4. all of these
  5. 3, 4 + 5
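
Writing combination options like those in Box 4 by hand is fiddly and error-prone, but the reformatting can be sketched in code. The function below (our own illustration, nothing to do with Practice Mill itself) places the correct combination of statement numbers among randomly chosen distractor combinations:

```python
import random
from itertools import combinations

def combination_options(n_statements, correct, n_options=5, seed=None):
    """Build single-answer options for a multi-answer MCQ: the correct
    combination of statement numbers plus random distractor combinations."""
    rng = random.Random(seed)
    correct = tuple(sorted(correct))
    # All combinations of two or more statements, minus the correct one.
    pool = [c for r in range(2, n_statements + 1)
            for c in combinations(range(1, n_statements + 1), r)
            if c != correct]
    options = rng.sample(pool, n_options - 1) + [correct]
    rng.shuffle(options)
    return options, options.index(correct)

# For a five-statement question whose correct subset is, say, (2, 4, 5):
options, answer_index = combination_options(5, correct=(2, 4, 5), seed=1)
```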


Some students found it difficult to read the Viewer cards on their screens, possibly because of their monitor configuration. Beyond this, one criticism of the Viewer is that it is visually rather crude and unattractive (see Figure 2).

Figure 2. A sample question in the free PM Viewer. The same question can be seen in an html format in the sample web test.

Some formative self-assessment tests for other groups of students have been trialled for web delivery, and an example is available online. Note that it will run in Explorer but not Netscape, and may initially appear to be unavailable: click Go and it should run!

John O’Neill has recently set up a directory in the biology student area so that we can run MCQ assessment tests in a terminal classroom with a group of students. They will run the test from a shared directory, and we should be able to collect the class results immediately and review student performance on individual questions. Watch this space and LT2003 (the annual Faculty learning and teaching event)!


If you already have MCQs available in one format, perhaps just simple text, it is obviously advantageous if they can be moved between the different formats used by each CAA tool. Question Tools undoubtedly fails in this respect. Although it is possible to cut and paste small pieces of text into the question construction boxes, this is unsatisfactory for more than a few questions. Similarly, even if you start with a blank sheet and write your questions in Question Tools, there is no simple way to export them into other applications. The other three systems all have some import and export capability. For example, WebCT questions can be imported and exported as simple text files, although there is a rigid structure to the question format. A PracticeMill test can be exported to a text file in a format that can be imported into Respondus with very little editing, and from there transferred to WebCT.
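
As a concrete illustration of portability, questions held in a simple text format can be round-tripped mechanically. The layout below is a generic one of our own devising; WebCT and Respondus each impose their own, stricter, text formats:

```python
def export_mcqs(questions):
    """Serialise MCQs to a simple text layout: a 'Q: ' stem line,
    numbered choices, and '*' marking the correct choice."""
    lines = []
    for q in questions:
        lines.append("Q: " + q["stem"])
        for i, choice in enumerate(q["choices"]):
            flag = "*" if i == q["answer"] else " "
            lines.append(f"{flag}{i + 1}. {choice}")
        lines.append("")
    return "\n".join(lines)

def import_mcqs(text):
    """Parse the same layout back into question dictionaries."""
    questions, current = [], None
    for line in text.splitlines():
        if line.startswith("Q: "):
            current = {"stem": line[3:], "choices": [], "answer": None}
            questions.append(current)
        elif line[:1] in ("*", " ") and current is not None and ". " in line:
            if line.startswith("*"):
                current["answer"] = len(current["choices"])
            current["choices"].append(line.split(". ", 1)[1])
    return questions
```

A converter between two real products is essentially these two functions with the target product's layout substituted for ours.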

If you want to explore CAA without a major commitment of time or resources then it is probably wise to begin by investigating free systems such as Question Tools (and the untested Hot Potatoes). Practice Mill could also be a useful starting point for developing CAA: it is relatively easy to set up a bank of simple MCQs that can be presented to students for self-assessment in a variety of ways, and all staff in the Science and Engineering Faculty can have a full individual copy via our site licence. All of the reviewed products' web sites allow you to run through sample tests, and all except WebCT can be downloaded and tested as time-limited trials. Alternatively, you could ask one of the 120 colleagues who are already using WebCT to share their experiences with you. The Learning & Teaching Unit explains how WebCT is used within MMU on its web site, and the WebCT training programme includes sessions on using WebCT for assessment.

Alan Fielding
0161 247 1198

Enid Bingham
0161 247 1199


February 2003
