Learning and Teaching in Action: Assessment


Paper Review


Gibbs, G. and Dunbar-Goddet, H. (2007) ‘The effects of programme assessment environments on student learning’, The Higher Education Academy

(Available on-line)

In this study the authors investigated the characteristics of assessment in three contrasting disciplines at three contrasting universities (Oxbridge, pre-1992 and post-1992). To achieve this they analysed documentation from the different programmes and interviewed the Director of Studies (or the equivalent). From this information they developed a coding system in which a number of assessment characteristics (such as the variety of assessment, volume of formative and summative assessment, volume of feedback, explicitness of criteria and standards, etc.) were rated as ‘low’, ‘medium’ or ‘high’. They also used a version of the Assessment Experience Questionnaire (AEQ), developed by Gibbs and Simpson in 2003, to survey students on the same programmes about their assessment environment. The AEQ had been revised following previous use in order to improve ‘its psychometric characteristics’ and to make it ‘appropriate to measure students’ experience of assessment environments of entire programmes rather than only individual course units’. This programme-level questionnaire had:

  • Five scales covering ‘quantity of effort, coverage of syllabus, quantity and quality of feedback, use of feedback, and learning from the examination’.
  • Two scales covering ‘appropriate assessment’ and ‘clear goals and standards’ (Course Experience Questionnaire, Ramsden, 1991).
  • Six items on student approaches to learning (surface and deep).
  • An overall item on student satisfaction.

The questionnaire was completed by 516 students, a 42% return overall. The paper gives details of a factor analysis showing that the AEQ was able to distinguish between programmes. Students were also interviewed, and although no details are given of how many, the paper does report examples from interviews covering the AEQ questions.

The paper showed that the three types of university had distinctive assessment environments: Oxbridge was relatively high in the proportion of marks coming from examination, the volume of formative assessment and the volume of oral feedback, though low in variety of assessment, volume of summative assessment, and alignment of goals and assessment. The pattern of post-1992 assessment characteristics was a mirror image of the Oxbridge model, and the pre-1992 university came somewhere in the middle.

Some of the results from this paper should come as no surprise: for example, where a greater percentage of marks comes from examination, there is less variety of assessment; where the percentage of marks coming from examinations is high, there is less summative assessment and more formative assessment. What is more interesting is the relationship between high degrees of alignment and explicit criteria and the student responses in the AEQ. For example, assessment environments where goals were explicit and there was a high degree of alignment between goals and assessment were associated with negative student learning responses, both in the AEQ and in focus groups. Students in these environments tended to narrow their efforts to those areas which would be assessed and were less satisfied with their studies. In addition, explicitness of goals and standards was not always associated with greater clarity about goals. A high variety of assessments, coupled with links to goals and standards, seemed to be linked to negative learning outcomes. Perhaps we need to think more about how we express those links between assessment and goals, which students in those types of assessment environments seem to find confusing. I would not, however, use this as an argument for reducing variety. If too much variety is needed to fulfil the learning outcomes, then perhaps consider reducing the number of learning outcomes instead. In addition, the recommendation for more formative and less summative assessment is not something I would argue against.

Maureen Dawson
Centre for Learning and Teaching

e-mail: m.m.dawson@mmu.ac.uk

Autumn 2008
ISSN 1477-1241