TY - JOUR
T1 - Impact of panelists' experience on script concordance test scores of medical students
AU - Peyrony, Olivier
AU - Hutin, Alice
AU - Truchot, Jennifer
AU - Borie, Raphaël
AU - Calvet, David
AU - Albaladejo, Adrien
AU - Baadj, Yousrah
AU - Cailleaux, Pierre Emmanuel
AU - Flamant, Martin
AU - Martin, Clémence
AU - Messika, Jonathan
AU - Meunier, Alexandre
AU - Mirabel, Mariana
AU - Tea, Victoria
AU - Treton, Xavier
AU - Chevret, Sylvie
AU - Lebeaux, David
AU - Roux, Damien
N1 - Publisher Copyright:
© 2020 The Author(s).
PY - 2020/9/17
Y1 - 2020/9/17
N2 - Background: The evaluation process of French medical students will evolve in the next few years in order to improve assessment validity. Script concordance testing (SCT) offers the possibility to assess medical knowledge alongside clinical reasoning under conditions of uncertainty. In this study, we aimed to compare the SCT scores of a large cohort of undergraduate medical students according to the experience level of the reference panel. Methods: In 2019, the authors developed a 30-item SCT and sent it to experts with varying levels of experience. Data analysis included score comparisons with paired Wilcoxon rank sum tests and concordance analysis with Bland & Altman plots. Results: A panel of 75 experts was divided into three groups: 31 residents, 21 non-experienced physicians (NEP), and 23 experienced physicians (EP). Within each group, random samples of N = 20, 15, and 10 were selected. A total of 985 students from nine different medical schools participated in the SCT examination. Regardless of panel size (N = 20, 15, or 10), students' SCT scores were lower with the NEP panel than with the resident panel (median score 67.1 vs 69.1, p < 0.0001 if N = 20; 67.2 vs 70.1, p < 0.0001 if N = 15; and 67.7 vs 68.4, p < 0.0001 if N = 10), and lower with the EP panel than with the NEP panel (65.4 vs 67.1, p < 0.0001 if N = 20; 66.0 vs 67.2, p < 0.0001 if N = 15; and 62.5 vs 67.7, p < 0.0001 if N = 10). Bland & Altman plots showed good concordance between students' SCT scores, whatever the experience level of the expert panel. Conclusions: Although students' SCT scores differed statistically according to the expert panel, these differences were small. These results open the possibility of including less-experienced experts in panels for the evaluation of medical students.
AB - Background: The evaluation process of French medical students will evolve in the next few years in order to improve assessment validity. Script concordance testing (SCT) offers the possibility to assess medical knowledge alongside clinical reasoning under conditions of uncertainty. In this study, we aimed to compare the SCT scores of a large cohort of undergraduate medical students according to the experience level of the reference panel. Methods: In 2019, the authors developed a 30-item SCT and sent it to experts with varying levels of experience. Data analysis included score comparisons with paired Wilcoxon rank sum tests and concordance analysis with Bland & Altman plots. Results: A panel of 75 experts was divided into three groups: 31 residents, 21 non-experienced physicians (NEP), and 23 experienced physicians (EP). Within each group, random samples of N = 20, 15, and 10 were selected. A total of 985 students from nine different medical schools participated in the SCT examination. Regardless of panel size (N = 20, 15, or 10), students' SCT scores were lower with the NEP panel than with the resident panel (median score 67.1 vs 69.1, p < 0.0001 if N = 20; 67.2 vs 70.1, p < 0.0001 if N = 15; and 67.7 vs 68.4, p < 0.0001 if N = 10), and lower with the EP panel than with the NEP panel (65.4 vs 67.1, p < 0.0001 if N = 20; 66.0 vs 67.2, p < 0.0001 if N = 15; and 62.5 vs 67.7, p < 0.0001 if N = 10). Bland & Altman plots showed good concordance between students' SCT scores, whatever the experience level of the expert panel. Conclusions: Although students' SCT scores differed statistically according to the expert panel, these differences were small. These results open the possibility of including less-experienced experts in panels for the evaluation of medical students.
KW - Medical student
KW - Panelist
KW - Script concordance test
UR - http://www.scopus.com/inward/record.url?scp=85091192845&partnerID=8YFLogxK
U2 - 10.1186/s12909-020-02243-w
DO - 10.1186/s12909-020-02243-w
M3 - Article
C2 - 32943030
AN - SCOPUS:85091192845
SN - 1472-6920
VL - 20
JO - BMC Medical Education
JF - BMC Medical Education
IS - 1
M1 - 313
ER -