Think! Evidence

Using item response theory to explore the psychometric properties of extended matching questions examination in undergraduate medical education


dc.creator Lawton Gemma
dc.creator Horton Mike
dc.creator Tennant Alan
dc.creator Bhakta Bipin
dc.creator Andrich David
dc.date 2005-03-01T00:00:00Z
dc.date.accessioned 2015-08-12T11:20:27Z
dc.date.available 2015-08-12T11:20:27Z
dc.identifier 10.1186/1472-6920-5-9
dc.identifier 1472-6920
dc.identifier https://doaj.org/article/93883d7345bf463ba1dd5bc621a2732b
dc.identifier.uri http://evidence.thinkportal.org/handle/123456789/28294
dc.description Abstract. Background: As assessment has been shown to direct learning, it is critical that the examinations developed to test clinical competence in medical undergraduates are valid and reliable. The use of extended matching questions (EMQ) has been advocated to overcome some of the criticisms of using multiple-choice questions to test factual and applied knowledge. Methods: We analysed the results from the Extended Matching Questions Examination taken by 4th-year undergraduate medical students in the academic year 2001 to 2002. Rasch analysis was used to examine whether the set of questions used in the examination mapped onto a unidimensional scale, the degree of difficulty of questions within and between the various medical and surgical specialties, and the pattern of responses within individual questions, in order to assess the impact of the distractor options. Results: Analysis of a subset of items and of the full examination demonstrated internal construct validity and the absence of bias on the majority of questions. Three main patterns of response selection were identified. Conclusion: Modern psychometric methods based upon the work of Rasch provide a useful approach to the calibration and analysis of EMQ undergraduate medical assessments. The approach allows for a formal test of the unidimensionality of the questions and thus the validity of the summed score. Given the metric calibration which follows fit to the model, it also allows for the establishment of item banks to facilitate continuity and equity in exam standards.
dc.language English
dc.publisher BioMed Central
dc.relation http://www.biomedcentral.com/1472-6920/5/9
dc.relation https://doaj.org/toc/1472-6920
dc.rights CC BY
dc.source BMC Medical Education, Vol 5, Iss 1, p 9 (2005)
dc.subject Medicine (General)
dc.subject R5-920
dc.subject Medicine
dc.subject R
dc.subject DOAJ:Medicine (General)
dc.subject DOAJ:Health Sciences
dc.subject Special aspects of education
dc.subject LC8-6691
dc.subject Education
dc.subject L
dc.subject DOAJ:Education
dc.subject DOAJ:Social Sciences
dc.title Using item response theory to explore the psychometric properties of extended matching questions examination in undergraduate medical education
dc.type article

