
Test Language Matters: Questioned Words by Saudi Arabian Paramedic Students

EMS World Expo 2018

Introduction—Item-writing guidelines for multiple-choice assessments address concerns regarding item focus, grammar, and language. Common guidelines include: items should test important learned content rather than trivial information; vocabulary should be concise and simple; content and structure should be homogeneous; and trick items should be avoided. Difficult vocabulary places some students at risk because it increases an item's reading demand. Simplified language is supported as an effective way to reduce the influence of reading ability, a source of construct-irrelevant variance when the assessment is intended to measure something else.

Methods—In May 2015, 44 senior paramedic students in Riyadh, Saudi Arabia, attempted Fisdap’s 200-item Paramedic Readiness Exam 4 (PRE4), a valid and reliable cognitive multiple-choice assessment. The students’ program was based on the standard U.S. curriculum and used U.S. textbooks. While taking the PRE4, participants wrote down words they did not know.

Results—A total of 127 words were reported. Table 1 reports words (n=56) questioned by multiple participants. Table 2 reports words questioned by a single participant. Table 3 reports the group’s overall performance on PRE4. Table 4 reports the correlation between words questioned by multiple students (n=56) and PRE4 topic areas.

Conclusion—This study reported a mixture of questioned nonmedical and medical words. Items phrased with construct-irrelevant words introduce an unfair disadvantage for students who are learning English or who come from different cultural backgrounds. The language used within an assessment should reflect current standards, guidelines, and reference materials, and it should adhere to universal design principles for maximum accessibility.
