The Impact of Three-Attempt Tests on Learning Outcomes in an Engineering Undergraduate Core Course: Dynamics
Abstract
The rapid shift to online assessment during COVID-19 encouraged educators to reexamine the role of testing integrity and its impact on learning outcomes. This paper focuses on how much learning and knowledge students gained during the pandemic while the integrity of the assessment was preserved. The virtual M-mode Dynamics class of 236 students was delivered via Zoom on a weekly basis. In this course, three tests were administered using Proctor Hub, the Respondus LockDown Browser with webcam, for the students' summative assessments. For each test, students were permitted only one screen and were instructed to show the camera what they were writing. Students were allotted one week to complete all three attempts of each test.
Given the large question pool, hundreds of conceptual and numerical problems pre-embedded into the CANVAS system prior to the pandemic, it was possible to adopt the alternative approach of granting three attempts on each computer-based assessment test. The large pool of questions reduced the likelihood that students would see the same questions again. Moreover, after every attempt the students were allowed to come and discuss their test, deepening their knowledge and improving their critical thinking before the next attempt.
The results of the first test quickly demonstrated that the average score improved by more than 30% between the first attempt and the third, owing to the mental encouragement and the repeated review of the material over the course of a week. In an anonymous survey, more than 98% of the students agreed that this format helped them learn more effectively than other examination settings.
Keywords
Respondus LockDown Browser, computer-based assessment, three-attempt tests, Virtual M-mode, Zoom.