
Large Language Models are Clinical Reasoners: Reasoning-Aware Diagnosis Framework with Prompt-Generated Rationales


By Taeyoon Kwon and 9 other authors


Abstract: Machine reasoning has made great progress in recent years owing to large language models (LLMs). In the clinical domain, however, most NLP-driven projects focus mainly on clinical classification or reading comprehension, and clinical reasoning for disease diagnosis remains under-explored because rationale annotation by clinicians is expensive. In this work, we present a "reasoning-aware" diagnosis framework that rationalizes the diagnostic process via prompt-based learning in a time- and labor-efficient manner, and learns to reason over the prompt-generated rationales. Specifically, we address clinical reasoning for disease diagnosis, where the LLM generates diagnostic rationales providing its insight into the presented patient data and the reasoning path toward the diagnosis, namely Clinical Chain-of-Thought (Clinical CoT). We empirically demonstrate the clinical reasoning ability of LLMs/LMs via extensive experiments and analyses on both rationale generation and disease diagnosis in various settings. We further propose a novel set of criteria for evaluating machine-generated rationales' potential for real-world clinical settings, facilitating and benefiting future research in this area.
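To make the Clinical CoT idea concrete, the sketch below shows one plausible way to assemble such a prompt: the model is given patient data and asked to summarize findings, reason step by step over candidate diseases, and only then commit to a diagnosis. The function name, prompt wording, and example record are hypothetical illustrations, not the paper's actual prompts.

```python
# Illustrative sketch of a Clinical Chain-of-Thought (Clinical CoT) prompt.
# All wording and the helper name `build_clinical_cot_prompt` are hypothetical;
# the paper's real prompt templates may differ substantially.

def build_clinical_cot_prompt(patient_record: str) -> str:
    """Assemble a prompt asking an LLM to rationalize a diagnosis step by step."""
    return (
        "You are a clinician. Read the patient data and reason toward a diagnosis.\n\n"
        f"Patient data:\n{patient_record}\n\n"
        "First, summarize the salient findings.\n"
        "Then, explain step by step how these findings support or rule out "
        "candidate diseases.\n"
        "Finally, state the most likely diagnosis.\n\n"
        "Rationale:"
    )

# Hypothetical patient record for illustration only.
prompt = build_clinical_cot_prompt(
    "67-year-old with fever, productive cough, and focal crackles on auscultation."
)
print(prompt)
```

Eliciting the rationale before the final diagnosis is the key point: the generated reasoning path is what the framework's smaller models later learn to reason over.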

Submission history

From: Taeyoon Kwon
[v1] Tue, 12 Dec 2023 16:14:45 UTC (1,981 KB)
[v2] Tue, 13 Feb 2024 03:48:00 UTC (4,902 KB)

