
PCRF Research Alert: Test-Enhanced Learning

Megan Corry

This ongoing series from the Prehospital Care Research Forum combs the literature to identify recent studies relevant to EMS education practices. 

Green ML, Moeller JJ, Spak JM. Test-enhanced learning in health professions education: A systematic review: BEME Guide No. 48. Med Teach, 2018 Apr; 40(4): 337–50. 

The most common reason we test students is to assess competence, but decades of research also support testing as a pedagogical strategy to enhance learning. In fact, the “test effect” has been demonstrated repeatedly across a variety of settings, including cognitive psychology labs, classrooms, simulation, and practical environments. 

In health-professions education programs, frequent testing (also known as retrieval practice) has proven superior to studying or rereading content. The test effect is stronger with frequent repeated tests that are spaced over time and accompanied by feedback, so students learn from their mistakes. Although reviews of the literature have been published, a systematic review of the effect of test-enhanced learning (TEL) interventions in the setting of healthcare-professions education programs has been lacking. 

Researchers at the Yale School of Medicine sought to measure the effectiveness of TEL on recall, retention, and transfer of learning and to quantify the magnitude of its effect across a variety of published studies. They further evaluated the setting, test format, timing, and cointerventions (e.g., feedback) and their effects on learning outcomes. Only randomized, controlled studies were included, and only those in which students were evaluated using objective criteria (not student satisfaction surveys). The authors also used the MERSQI (Medical Education Research Study Quality Instrument) to assess the quality of the included studies. MERSQI is a validated tool that reliably gauges the methodological strength of quantitative research; its scoring system consists of 10 items across six domains of quality. Studies included in this systematic review scored between 9 and 15.5 out of a possible 18 points. 

Database searches and manual table-of-contents reviews yielded more than 6,000 records. After screening for duplicates and TEL content, only 88 remained for full-text review. Three dozen of these were excluded for lack of relevance to the study question, and another 33 for the lack of an appropriate control group, leaving 19 articles for inclusion. Outcomes were grouped into immediate, retention, and transfer outcomes. Effects were compared using the standardized mean difference (SMD); SMDs greater than 0.5 are generally considered practically significant. 
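For readers unfamiliar with the statistic, the standardized mean difference is essentially Cohen's d: the difference between group means divided by their pooled standard deviation. The sketch below illustrates the calculation with made-up exam scores for a TEL group and a restudy control group; the numbers are hypothetical and are not data from the review.

```python
from statistics import mean, stdev

def standardized_mean_difference(treatment, control):
    """Standardized mean difference (Cohen's d with a pooled SD).

    Both arguments are lists of individual scores; values here are
    hypothetical and for illustration only.
    """
    n1, n2 = len(treatment), len(control)
    s1, s2 = stdev(treatment), stdev(control)
    # Pool the two sample variances, weighted by degrees of freedom
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled_sd

# Illustrative exam scores: TEL (retrieval practice) vs. restudy control
tel_scores = [82, 85, 78, 90, 88, 84]
restudy_scores = [75, 80, 72, 83, 79, 77]
print(round(standardized_mean_difference(tel_scores, restudy_scores), 2))  # → 1.67
```

By the review's rule of thumb, an SMD above 0.5 (as in this toy example) would indicate a difference of practical importance, not merely statistical significance.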

Student subjects in the 19 studies included medical students (8), nursing students (3), allied health students (3), residents (3), and physicians in CME programs (2). TEL interventions included multiple-choice and short-answer questions, simulation (resuscitation), standardized patients, and clinical reasoning situations. Cointerventions included feedback and self-explanation. Results showed that 5/6 immediate-learning outcomes and 21/23 retention outcomes (1 week to 6 months after TEL) favored TEL over studying. Three studies with 7 transfer outcomes also favored TEL over studying. 

Not surprisingly, studies that reassessed students at later dates found decay in the TEL effect. A comparison of TEL strategies showed that high-level test items, such as scenario-based questions, showed a more positive effect on retention than simple knowledge-based items. Although cointerventions (feedback, self-explanation) were used, none of these studies looked at the independent effects of these interventions on learning and retention outcomes. TEL strategies that used repeated, spaced tests with feedback and high-level questions had the greatest impact on immediate and long-term learning outcomes. 

Like any systematic review, this one is subject to publication bias: the authors could have missed smaller nonrandomized studies, qualitative research, and research that went unpublished because of negative results. Even so, a large body of literature across educational fields already demonstrates positive outcomes from the test effect. Given the high volume of content EMS students must learn, and its potential impact on patient outcomes, the results of this systematic review point to TEL as an essential teaching strategy for EMS education programs, with the potential to benefit both students and patients. 

For further discussion join our inaugural PCRF/NAEMSE Education Research Podcast on September 28, 2018, at 10 a.m. PT/12 p.m. CT. Check back soon for a direct registration link. 

Megan Corry, EdD, EMT-P, is the program director and full-time faculty for the City College of San Francisco paramedic program and on the board of advisors of the UCLA Prehospital Care Research Forum.

 
