
RESEARCH LIBRARY
View the latest publications from members of the NBME research team
Advancing Natural Language Processing in Educational Assessment
This book examines the use of natural language technology in educational testing, measurement, and assessment. Recent developments in natural language processing (NLP) have enabled large-scale educational applications, though scholars and professionals may lack a shared understanding of the strengths and limitations of NLP in assessment as well as the challenges that testing organizations face in implementation. This first-of-its-kind book provides evidence-based practices for the use of NLP-based approaches to automated text and speech scoring, language proficiency assessment, technology-assisted item generation, gamification, learner feedback, and beyond.
Advancing Natural Language Processing in Educational Assessment: Pages 167-182
This chapter discusses the evolution of natural language processing (NLP) approaches to text representation and how different ways of representing text can be utilized for a relatively understudied task in educational assessment – that of predicting item characteristics from item text.
Advancing Natural Language Processing in Educational Assessment: Pages 58-73
This chapter describes INCITE, an NLP-based system for scoring free-text responses. It emphasizes the importance of context and the system’s intended use and explains how each component of the system contributed to its accuracy.
Similarities Between Clinically Matched and Unmatched Analogue Patient Raters: A Mixed Methods Study
Patient Education and Counseling: Volume 109, Supplement, April 2023, Page 2
Physicians' responses to patient communication were assessed by both clinically matched and unmatched analogue patients (APs). Significant correlations between their ratings indicated consistency in evaluating physician communication skills. Thematic analysis identified twenty-one common themes in both clinically matched and unmatched AP responses, suggesting similar assessments of important behaviors. These findings imply that clinically unmatched APs can effectively substitute for clinically matched ones in evaluating physician communication and offering feedback when the latter are unavailable.
Medical Teacher: Volume 45 - Issue 6, Pages 565-573
This guide aims to describe practical considerations involved in reading and conducting medical education studies that use Artificial Intelligence (AI), define basic terminology, and identify which medical education problems and data are ideally suited to AI.
Diagnosis: Volume 10, Issue 1, Pages 54-60
This op-ed discusses the advantages of leveraging natural language processing (NLP) in the assessment of clinical reasoning. It also provides an overview of INCITE, the Intelligent Clinical Text Evaluator, a scalable NLP-based computer-assisted scoring system that was developed to measure clinical reasoning ability as assessed in the written documentation portion of the now-discontinued USMLE Step 2 Clinical Skills examination.
Academic Medicine: Volume 97 - Issue 11S - Page S176
As Step 1 transitions to pass/fail score reporting, the impact of score goals on wellness merits consideration. This study examines the relationship between goal score, gender, and students' self-reported anxiety, stress, and overall distress immediately following their completion of Step 1.
Journal of Graduate Medical Education: Volume 14, Issue 3, Pages 353-354
Letter to the editor.
Medical Science Educator: Volume 31, Pages 607-613 (2021)
This study extended previous research on the NBME Clinical Science Mastery Series self-assessments to investigate the utility of recently released self-assessments for students completing Family Medicine clerkships and Emergency Medicine sub-internships and preparing for summative assessments.
Teaching and Learning in Medicine: Volume 33 - Issue 4, Pages 366-381
Clinical subject examination (CSE) scores for students from eight schools that moved Step 1 after core clerkships between 2012 and 2016 were analyzed in a pre-post format. Hierarchical linear modeling was used to quantify the effect of the curriculum change on CSE performance. Additional analyses determined whether clerkship order affected CSE performance and whether more students scored in the lowest percentiles after the curricular change.