Tools for Computer-Aided Assessment
Alan Fielding and Enid Bingham
Box 1. Some suggested web resources that address general aspects of CAA.
CAA Tools, Resources and Articles: http://www.ulst.ac.uk/cticomp/CAA.html
Designing and Managing Multiple Choice Questions: http://www.unn.ac.uk/academic/ss/psychology
Good question design for objective testing: http://www.ltss.bris.ac.uk/in20p04.htm
Advice for students taking MCQ tests: http://www.psych.ucalgary.ca/CourseNotes/
Box 2. Contact details for the software covered by this review
WebCT: http://www.webct.com/ (the MMU implementation can be accessed at http://odl.mmu.ac.uk/)
PracticeMill: http://www.simstat.com/home.html (UK distributor and download from http://www.kovcomp.com/, Kovach Computing)
Biographical notes and caveats
Both authors have more than 25 years' teaching experience at all levels, from foundation to postgraduate. In the mid-1980s both of us took advantage of career breaks to study for an MSc in Biological Computation at the University of York. Our MScs gave us strong IT skills that we have subsequently used in both teaching and administrative roles. Although forgetfulness comes with increasing age, we find it difficult to recognise the constraints that less IT-capable staff would experience when using these CAA tools. Inevitably there is a trade-off between a product's complexity and the skills needed to exploit its potential fully.
It is always difficult to make fair and comprehensive comparisons between different pieces of software, particularly when you begin with considerable experience of, and opinions about, one or more of the products under test. One of us (AF) has five years' experience with WebCT (biological data processing for second-year undergraduate students and three MSc courses), while the other (EB) has spent over 12 months developing real assessments (PgC Biomedical Sciences course) using PracticeMill. Unfortunately it was not possible to test Question Tools and Questionmark under real teaching situations. This means that we may have missed some of their important advantages and disadvantages, which only become apparent in real-world use: it is only when software is applied to real problems that the nuances and annoyances of its design come to light. However, our experience did enable us to identify features that we considered important. For example, question portability is essential: few of us have the time to spend hours retyping questions into a different format.
There is an important distinction between assessments that immediately transmit student responses or performance to a central resource and those that are completed on a stand-alone, non-networked, computer. Because it is simpler to ‘police’ and manage assessments on networked computers, we would normally restrict assessments using stand-alone computers to formative and self-diagnostic measures. These products all have some potential to run assessments from secure servers. However, this protection comes at a cost, which may, in addition to extra purchase or licence costs, include significant server management time and skills.
WebCT (http://odl.mmu.ac.uk is the MMU implementation) is a large VLE managed for academics by staff in the Learning & Teaching unit, particularly Rachel Forsyth and Robert Ready. Part of the WebCT functionality is the CAA, or quiz, tool. You have a great deal of control over the delivery settings for a WebCT CAA: for example, you can set the dates between which a quiz is available, impose a time limit, and control the number of attempts allowed.
There are five question formats but only four can be used for ‘pure’ CAA. ‘Paragraph’ questions allow students to enter free text, but this must be marked by an academic. Although images can be included in questions they are not ‘active’, unlike the options available in some Questionmark and Question Tools Editor questions. The remaining four question types are multiple choice, matching, calculated and short answer.
One criticism of WebCT is that questions are generally written using the WebCT interface. This requires an internet connection, and response times can be slow if the network and/or server are busy. An alternative is to use the stand-alone Windows program Respondus (http://www.respondus.com/) to design questions, which can then be uploaded to the WebCT server. Respondus is not free, but it may be worth considering since it is easy to use and can export questions in a range of formats (at least in the just-released version 2).
Questionmark Perception is widely used in industry and many universities. It is available in two formats: Perception for Windows and Perception for Web. As the names suggest, Perception for Windows is used to create, deliver and report on assessments using a Windows PC, while Perception for Web, "our most popular package", is used to administer assessments using the Internet or intranets. Perception can create questions in a number of different formats and place them into topics and sub-topics. Topics can be assigned "Topic Outcomes" so that topic-based feedback can be provided based on the score achieved for that specific topic. The available question types depend on the authoring tool used: the Windows version has more question types than the browser-based authoring tool (Box 3).
Box 3 Question types in Questionmark’s Perception software
Perception for Windows authoring only
Both authoring tools
Essay question: up to 30,000 characters of text can be entered (not computer-marked).
Likert scale: the participant selects one choice from options such as "strongly agree" through "strongly disagree"; the choices are weighted with numbers to aid analysis of the results.
Multiple choice: select one choice from up to 40 possible answers. There is no limit to the length of each answer.
Multiple response: similar to multiple choice except the participant is not limited to choosing one response.
Numeric questions: a numeric value must be entered and this may be scored as one value for an exact answer and another score if the response is within a range.
True/False: the participant selects "true" or "false" in response to the question.
Yes/No: the participant selects "Yes" or "No" in response to the question.
Word response (text match): the participant types in a single word, or a few words, to indicate their response. You define right or wrong words or phrases in advance by entering a list of acceptable answers. The grading logic can also allow scoring based on the presence or absence of keywords or key phrases, and can check for misspellings.
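As a rough illustration, the numeric and text-match grading rules described in Box 3 might be sketched as follows. This is a hypothetical Python sketch of the logic only, not Questionmark's implementation; the function names and mark values are our own assumptions.

```python
# Hypothetical sketch of the Box 3 grading rules; NOT Questionmark's
# actual implementation. Function names and mark values are invented.

def grade_numeric(response, exact, lo, hi, exact_score=2, range_score=1):
    """Score an exact numeric answer higher than one merely in range."""
    if response == exact:
        return exact_score
    if lo <= response <= hi:
        return range_score
    return 0

def grade_text(response, acceptable):
    """Award a mark if the response matches an acceptable answer
    defined in advance (case-insensitive)."""
    return 1 if response.strip().lower() in acceptable else 0

print(grade_numeric(3.14, 3.14159, 3.1, 3.2))          # → 1 (in range, not exact)
print(grade_text("Mitochondrion", {"mitochondrion"}))  # → 1
```

A real system would add tolerance handling and the keyword/misspelling checks mentioned above, but the scoring skeleton is the same.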
Perception’s web-based tools are controlled from the Enterprise Manager home page. The three main areas of this comprehensive package are summarised in Figure 1.
Figure 1. The Questionmark Perception Enterprise Manager structure
The Questionmark applications are very sophisticated, professional tools for the design, implementation and management of CAA. However, the pricing models suggest that these are unlikely to be departmental purchases; indeed, their purchase could probably only be justified at Faculty or institutional level. Since WebCT is already available universally across MMU, it is difficult to imagine a scenario under which the additional expenditure and server management needs could be justified unless the WebCT licence was terminated. Although the WebCT assessment tools are more limited than those in the Questionmark products, the extra VLE functionality of WebCT probably gives it the advantage at present.
Question Tools has five modules, two of which are free (but must be registered). The publishers state that they have never charged for these free products and never will. The modules are:
SimpleSet: a simple, free question editor that can create tests delivered as web pages or as ‘exams’.
Exam: a free, secure alternative to web delivery for tests.
Editor: the professional version of the question editor, with a wider range of options and a graphical interface for positioning question elements. A free version of the Editor can be used to create small tests.
Results Analyser: produces summaries of test performances.
Server: a delivery and management tool for tests delivered across an intranet or the internet.
In addition, Question Tools offers a test ‘hosting’ service which allows users to upload tests on to the publisher's server for delivery. All of the server-delivered options provide greater security and problem-recovery facilities. The question types are more restricted than in the other products, but the common types are included.
Select/Multiple choice: students select the correct answer from a list (2–4 choices). In SimpleSet only one correct answer is allowed; Question Tools Editor allows multiple correct answers with variable marks.
True/False: students rate statements as true or false. 1–4 buttons are allowed per question.
Answer Fields: short text answers are evaluated. Multiple correct answers are allowed. 1–4 text fields are allowed per question.
Additionally, using Question Tools Editor:
Drag Question: the student drags answers into the correct positions.
Feedback, sound and graphics are only available with questions created using Question Tools Editor.
There are four options available for the delivery of tests: web pages; Question Tools Exam; the Question Tools local server; or the Question Tools host server. The web pages are simple to create and do not require any special scripts, so they should work from most servers, but they only work with Internet Explorer. Question Tools Exam is a small, freely distributable program. It is more secure than the browser-based tests because it is harder to close and it uses a secure encryption method. Question Tools Exam can also collect group results over a local area network, which can then be analysed in conjunction with Question Tools Results Analyser. The Exam software also includes support for users with visual and hearing impairments.
Despite the two free modules, a complete installation at departmental or institutional level is quite expensive. For example, an installation with 100 Editor and Results Analyser licences plus a 12-month server licence would cost almost £10,000.
Practice Mill is a simple and relatively cheap program for creating MCQ tests. Indeed, the £185 price for a multi-user site licence is on a par with some single-user licences. It consists of a Question Editor that is used to create tests that can be delivered as web pages, ‘exams’ or via the proprietary ‘PMViewer’. PMViewer is a small, stand-alone program that can be freely distributed. It presents the tests in the form of a ‘stack of cards’ and can be run from a PC hard drive, a floppy disc or the student network. The amount and type of feedback can be set, and the editor can also be used to collect student performance data and analyse responses to individual questions. A User Accounts Manager allows user lists and passwords to be created to control access to appropriate tests.
It is very easy to add and edit questions in the Practice Mill Editor. After ten minutes' instruction, an administration assistant was confident enough to enter six tests, each containing 20 questions, in a very short time. There are limitations, of course, some quite severe. One is that only two question types are allowed, short answer and single-response multiple choice, and these cannot be combined in the same test. One way of overcoming the single-answer constraint is to restructure the question (see Box 4); this has the additional advantage of making questions more challenging for postgraduate students. A further serious limitation for scientists and mathematicians is that superscripts, subscripts and symbols are not available, although images (e.g. gif files) can be used to overcome these problems to some extent. The length of text lines, especially in the answer choices, is limited if the test is used with the Viewer.
We have a group of part-time PgC students who have to complete six MCQ tests as part of their assessment. They seemed an ideal group with which to try out Practice Mill and to replace the paper-based tests with CAA. Although it is possible to create web-based tests, there are problems. The directory that contains the tests must have both read and write access if student performances are to be recorded, which was too insecure to risk on our departmental server. It is possible to set up a web-based test to e-mail results on completion, and we had a trial run at this. Unfortunately it was difficult to extract information from the resulting e-mails and, with around 200 expected, this did not seem a practical option either. Instead we used individual practice discs presenting the tests through the Viewer. As the students take tests, log files keep a record, and the students, who have very varied attendance patterns, either e-mail their log files or bring in their discs from time to time to have their marks recorded. This is not the most efficient process, but it is not too time-consuming with a group of this size (35 students).
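The mark-recording chore just described can be partly automated once the log files are gathered in one place. The sketch below assumes, purely for illustration, that each log entry reduces to a (student, test, score) row; the real PMViewer log layout differs and would need its own parser.

```python
# Minimal sketch of collating marks from gathered log files.
# The (student, test, score) row format is a hypothetical stand-in
# for whatever the real PMViewer logs contain.

def collate(rows):
    """Keep each student's best score per test."""
    marks = {}
    for student, test, score in rows:
        best = marks.setdefault(student, {})
        best[test] = max(best.get(test, 0), int(score))
    return marks

# Rows could be accumulated from discs or e-mailed files, e.g.:
# import csv, glob
# rows = []
# for path in glob.glob("logs/*.log"):
#     with open(path, newline="") as fh:
#         rows.extend(csv.reader(fh))
```

Keeping only the best mark per test is one possible policy; first-attempt-only would be a one-line change.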
Box 4 A multi-answer MCQ reformatted as a single answer question.
1. Normal flora can
The answer is:-
Some students found it difficult to read the Viewer cards on their screens, possibly because of their monitor configuration, but apart from this the main criticism of the Viewer is that it is visually rather crude and unattractive (see Figure 2).
Figure 2. A sample question in the free PM Viewer. The same question can be seen in an html format in the sample web test.
Some formative self-assessment tests for other groups of students have been trialled for web delivery; an example can be found at http://22.214.171.124/eb/adme/adme.htm. Note that this will run in Internet Explorer but not Netscape, and may initially appear to be unavailable: click Go and it should run!
John O’Neill has recently set up a directory in the biology student area so that we can do assessment MCQ tests in a terminal classroom with a group of students. They will run the test from a shared directory and we should be able to collect the class results immediately and to review the student performance on individual questions. Watch this space and LT2003 (the annual Faculty learning and teaching event)!
If you already have MCQs available in one format, perhaps just simple text, it is obviously advantageous if they can be moved between the different formats used by each CAA tool. Undoubtedly Question Tools fails in this respect. Although it is possible to cut and paste small pieces of text into the question construction boxes, this is unsatisfactory for more than a few questions. Similarly, if you start from a blank sheet in Question Tools, there is no simple way to export your questions into other applications afterwards. The other three systems all have some import and export capability. For example, WebCT questions can be imported and exported as simple text files, although there is a rigid structure to the question format. A PracticeMill test can be exported to a text file in a format that enables it to be imported into Respondus with very little editing, and from there it could be transferred to WebCT.
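To illustrate what such portability involves, the sketch below converts between two invented plain-text MCQ layouts. Both layouts are hypothetical: the real WebCT and Respondus import formats each have their own rigid syntax, documented by those products, and a genuine converter would target those.

```python
# Illustrative converter between two INVENTED plain-text MCQ layouts;
# real WebCT/Respondus import formats have their own rigid syntax.

def parse_simple(text):
    """Parse questions written as:
       Q: stem
       *correct choice
       wrong choice
       (blank line between questions)"""
    questions = []
    for block in text.strip().split("\n\n"):
        lines = block.splitlines()
        stem = lines[0].removeprefix("Q: ")
        choices = [(l.lstrip("*"), l.startswith("*")) for l in lines[1:]]
        questions.append((stem, choices))
    return questions

def to_numbered(questions):
    """Emit a numbered layout: '1. stem' then 'a) choice', with the
       correct choice flagged '[*]'."""
    out = []
    for i, (stem, choices) in enumerate(questions, 1):
        out.append(f"{i}. {stem}")
        for letter, (choice, correct) in zip("abcdefgh", choices):
            out.append(f"{letter}) {choice}" + (" [*]" if correct else ""))
    return "\n".join(out)
```

The point is not the particular layouts but that a question bank held in any regular text format can be re-targeted with a few dozen lines of script, which is far quicker than retyping.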
If you want to explore CAA without a major commitment of time or resources, it is probably wise to begin by investigating free systems such as Question Tools (and the untested Hot Potatoes). Practice Mill could also be a useful starting point for developing CAA: it is relatively easy to set up a bank of simple MCQs that can be presented to students for self-assessment in a variety of ways, and all staff in the Science and Engineering Faculty can have a full individual copy via our site licence. All of the web sites of the reviewed software allow you to run through sample tests and all, except WebCT, can be downloaded and tested as time-limited trials. Alternatively, you could ask one of the 120 colleagues who are already using WebCT to share their experiences with you. The Learning & Teaching Unit explains how WebCT is used within MMU on its web site (http://www.celt.mmu.ac.uk/online_learning/welcome.htm). The WebCT training programme is available at http://www.celt.mmu.ac.uk/professional_support/prog.htm#webct, and includes sessions on using WebCT for assessment.