Clinical Note Generation to Address Physician Burnout
Physician burnout contributes significantly to the declining quality and personalization of clinical visits. Burnout can in large part be attributed to tedious and inefficient current EHR (electronic health record) systems, as each visit can require several hours of documentation afterward. In this paper, we build ClinicalGPT-2, a language model that generates clinical note contents and could be deployed as part of an auto-complete system to increase the efficiency of the clinical documentation process. We are among the first to apply the GPT-2 architecture to this task, and results show that a small GPT-2 model finetuned on the MIMIC-III clinical note corpus can replicate note structure reliably. Furthermore, it often fills in contents of reasonable length and semantic appropriateness. The same model struggles with medical abbreviations, special characters, and nuanced formatting, illustrating the importance of data quality and pre-processing. Our findings have significant implications for the feasibility of using such models in text-prediction software in real-world clinical settings.
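To illustrate the auto-complete concept the abstract describes, the sketch below ranks next-word suggestions from a toy bigram model. This is a hedged stand-in, not the paper's method: the actual system uses a finetuned GPT-2, and the tiny example corpus here is invented (MIMIC-III requires credentialed access), but the suggestion-ranking interface is analogous.

```python
from collections import Counter, defaultdict

# Invented toy corpus standing in for clinical notes.
corpus = [
    "patient admitted with chest pain",
    "patient admitted with shortness of breath",
    "patient discharged in stable condition",
    "patient admitted with chest tightness",
]

# Build bigram counts: for each word, count the words that follow it.
bigrams = defaultdict(Counter)
for note in corpus:
    tokens = note.split()
    for prev, nxt in zip(tokens, tokens[1:]):
        bigrams[prev][nxt] += 1

def suggest(prev_word, k=3):
    """Return up to k most frequent continuations of prev_word,
    i.e. the candidates an auto-complete UI would surface."""
    return [w for w, _ in bigrams[prev_word].most_common(k)]

print(suggest("admitted"))
print(suggest("chest"))
```

In a deployed system, the bigram table would be replaced by next-token probabilities from the language model, but the surrounding auto-complete logic (rank candidates, show the top k) stays the same.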