Can Language Models Reason About ICD Codes to Guide the Generation of Clinical Notes?
Document Type
Article
Publication Date
2025
Publication Title
EasyChair Preprints
Pages
15731 (1-13)
Abstract
The past decade has seen a surge in the amount of electronic health record (EHR) data in the United States, attributed to a favorable policy environment created by the Health Information Technology for Economic and Clinical Health (HITECH) Act of 2009 and the 21st Century Cures Act of 2016. Clinical notes documenting patients’ assessments, diagnoses, and treatments are captured in these EHRs as free-form text by physicians, who spend a considerable amount of time entering them. Manually writing clinical notes increases patients’ waiting times and can delay diagnoses. Large language models (LLMs), such as GPT-3, possess the ability to generate news articles that closely resemble human-written ones. We investigate the use of Chain-of-Thought (CoT) prompt engineering to improve LLM responses in clinical note generation. In our prompts, we incorporate International Classification of Diseases (ICD) codes and basic patient information along with similar clinical case examples to investigate how LLMs can effectively formulate clinical notes. We tested our CoT prompting technique on six clinical cases from the CodiEsp test dataset using GPT-4 as our LLM, and our results show that it outperformed the standard zero-shot prompt.
Rights
© 2025 The Authors.
Included with the kind written permission of the authors.
Original Publication Citation
Makohon, I., Wu, J., Feng, B., & Li, Y. (2025). Can language models reason about ICD codes to guide the generation of clinical notes? EasyChair Preprints. https://easychair.org/publications/preprint/HPFd
Repository Citation
Makohon, I., Wu, J., Feng, B., & Li, Y. (2025). Can language models reason about ICD codes to guide the generation of clinical notes? EasyChair Preprints. https://easychair.org/publications/preprint/HPFd
ORCID
0000-0002-3627-7242 (Makohon), 0000-0003-0173-4463 (Wu), 0000-0002-7892-5295 (Li)
Comments
This is a preprint that has not undergone peer review.