Think! Evidence

Beyond model answers: learners' perceptions of self-assessment materials in e-learning applications

dc.creator Karen Handley
dc.creator Benita Cox
dc.date 2007-12-01T00:00:00Z
dc.date.accessioned 2015-07-20T22:08:15Z
dc.date.available 2015-07-20T22:08:15Z
dc.identifier 10.3402/rlt.v15i1.10909
dc.identifier 2156-7069
dc.identifier 2156-7077
dc.identifier https://doaj.org/article/f0682c052679455ea5c3c42bb9a7b409
dc.identifier.uri http://evidence.thinkportal.org/handle/123456789/11854
dc.description The importance of feedback as an aid to self-assessment is widely acknowledged. A common form of feedback in e-learning is the model answer. However, model answers are deficient in many respects. In particular, the notion of a ‘model’ answer implies the existence of a single correct answer applicable across multiple contexts, with no scope for permissible variation. This reductive assumption rarely holds for the complex problems that are supposed to test students’ higher-order learning. Nevertheless, the challenge remains of how to support students as they assess their own performance using model answers and other forms of non-verificational ‘feedback’. To explore this challenge, the research examined a management development e-learning application, investigating the effectiveness of the model answers that followed its problem-based questions. The research was exploratory, using semi-structured interviews with 29 adult learners employed in a global organisation. Given interviewees’ generally negative perceptions of the model answers, they were asked to describe their ideal form of self-assessment materials and to evaluate nine alternative designs. The results suggest that, as support for higher-order learning, self-assessment materials that merely present an idealised model answer are inadequate. As alternatives, learners preferred materials that helped them understand which behaviours to avoid (not just which to ‘do’), how to think through the problem (i.e. critical thinking skills), and the key issues that provide a framework for thinking. These findings have broader relevance within higher education, particularly in postgraduate programmes for business students, where the importance of prior business experience is emphasised and the student profile is similar to that of the participants in this research.
dc.language English
dc.relation http://www.researchinlearningtechnology.net/index.php/rlt/article/view/10909
dc.relation https://doaj.org/toc/2156-7069
dc.relation https://doaj.org/toc/2156-7077
dc.rights CC BY
dc.source Research in Learning Technology, Vol 15, Iss 1 (2007)
dc.subject Education (General)
dc.subject L7-991
dc.subject Education
dc.subject L
dc.subject DOAJ:Education
dc.subject DOAJ:Social Sciences
dc.title Beyond model answers: learners' perceptions of self-assessment materials in e-learning applications
dc.type article


