Implementing computerized adaptive test
DOI: https://doi.org/10.15448/1980-6108.2019.3.34432
Keywords: medical education, educational assessment, computerized adaptive test.
Abstract
Traditionally, the assessment of knowledge consists of tests in which all students answer the same items at the same time, such as a test on a specific subject. Such a test may be perceived as too easy or too difficult by a given student. In both cases, the test is likely to bore students, and it may provide little information about their knowledge level. One way of solving this problem is to create a tailored test for each student, in which the next item is selected based on the student's performance on previous items. This type of test is known as a computerized adaptive test. Computerized adaptive testing provides both educational and psychometric advantages over traditional paper-and-pencil testing. A computerized adaptive test requires fewer items than a traditional test, which in turn decreases student fatigue and optimizes learning. Furthermore, a computerized adaptive test is designed for each student, taking into account the difficulty of each item. This makes the test more attractive and authentic, since the items are always aligned with the student's knowledge level. Because a computerized adaptive test depends on both item difficulty and student ability, it requires the use of Item Response Theory, which establishes a relationship between item difficulty, student ability, and the probability of answering an item correctly. Although implementing a computerized adaptive test is complex, it meets a higher standard both from a psychometric point of view and in its alignment with modern theories of learning. Because of this complexity, computerized adaptive testing has usually been restricted to large-scale, high-stakes testing. However, given the new educational paradigm, which calls for tailor-made education that respects each student's pace, computerized adaptive testing is likely to be used more widely over time.
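The select-administer-update loop described in the abstract can be sketched in a few lines. The sketch below is a minimal illustration, assuming the Rasch (one-parameter logistic) IRT model; the fixed-step ability update, the deterministic simulated student, and the tiny five-item bank are hypothetical simplifications for clarity (operational CATs use maximum-likelihood or Bayesian ability estimation over large calibrated item banks).

```python
import math

def rasch_probability(theta, b):
    """Rasch (1PL) model: probability of a correct answer for a
    student of ability theta on an item of difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def next_item(theta, remaining):
    """Pick the unanswered item whose difficulty is closest to the
    current ability estimate -- under the Rasch model, that is the
    item providing the most information about the student."""
    return min(remaining, key=lambda b: abs(b - theta))

def update_theta(theta, correct, step=0.5):
    """Crude fixed-step ability update: move the estimate up after a
    correct answer, down after an incorrect one (a stand-in for
    maximum-likelihood estimation)."""
    return theta + step if correct else theta - step

# Tiny simulated CAT session: a "student" of true ability 1.0
# answers items drawn from a small calibrated bank.
bank = [-2.0, -1.0, 0.0, 1.0, 2.0]   # item difficulties (logits)
theta, true_theta = 0.0, 1.0         # initial estimate, true ability
for _ in range(4):
    b = next_item(theta, bank)
    bank.remove(b)
    # Deterministic simulated student: answers correctly whenever
    # the model gives at least a 50% chance of success.
    correct = rasch_probability(true_theta, b) >= 0.5
    theta = update_theta(theta, correct)
print(round(theta, 2))  # → 1.0, converging toward the true ability
```

Note how each item is chosen to match the current ability estimate rather than drawn in a fixed order: after four items the estimate already sits at the student's true ability, which is the efficiency gain the abstract attributes to adaptive testing.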
Copyright (c) 2019 Scientia Medica
This work is licensed under a Creative Commons Attribution 4.0 International License.
Copyright
The submission of originals to Scientia Medica implies the transfer by the authors of the right of publication. Authors retain copyright and grant the journal the right of first publication. If the authors wish to include the same data in another publication, they must cite Scientia Medica as the site of original publication.