Anna Fink, Alexander Rau, Elmar Kotter, Fabian Bamberg, Maximilian Frederik Russe
{"title":"[与大型语言模型的优化交互:提示工程和检索增强生成的实用指南]。","authors":"Anna Fink, Alexander Rau, Elmar Kotter, Fabian Bamberg, Maximilian Frederik Russe","doi":"10.1007/s00117-025-01416-2","DOIUrl":null,"url":null,"abstract":"<p><strong>Background: </strong>Given the increasing number of radiological examinations, large language models (LLMs) offer promising support in radiology. Optimized interaction is essential to ensure reliable results.</p><p><strong>Objectives: </strong>This article provides an overview of interaction techniques such as prompt engineering, zero-shot learning, and retrieval-augmented generation (RAG) and gives practical tips for their application in radiology.</p><p><strong>Materials and methods: </strong>Demonstration of interaction techniques based on practical examples with concrete recommendations for their application in routine radiological practice.</p><p><strong>Results: </strong>Advanced interaction techniques allow task-specific adaptation of LLMs without the need for retraining. The creation of precise prompts and the use of zero-shot and few-shot learning can significantly improve response quality. RAG enables the integration of current and domain-specific information into LLM tools, increasing the accuracy and relevance of the generated content.</p><p><strong>Conclusions: </strong>The use of prompt engineering, zero-shot and few-shot learning, and RAG can optimize interaction with LLMs in radiology. Through these targeted strategies, radiologists can efficiently integrate general chatbots into routine practice to improve patient care.</p>","PeriodicalId":74635,"journal":{"name":"Radiologie (Heidelberg, Germany)","volume":" ","pages":"235-242"},"PeriodicalIF":0.0000,"publicationDate":"2025-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"[Optimized interaction with Large Language Models : A practical guide to Prompt Engineering and Retrieval-Augmented Generation].\",\"authors\":\"Anna Fink, Alexander Rau, Elmar Kotter, Fabian Bamberg, Maximilian Frederik Russe\",\"doi\":\"10.1007/s00117-025-01416-2\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Background: </strong>Given the increasing number of radiological examinations, large language models (LLMs) offer promising support in radiology. Optimized interaction is essential to ensure reliable results.</p><p><strong>Objectives: </strong>This article provides an overview of interaction techniques such as prompt engineering, zero-shot learning, and retrieval-augmented generation (RAG) and gives practical tips for their application in radiology.</p><p><strong>Materials and methods: </strong>Demonstration of interaction techniques based on practical examples with concrete recommendations for their application in routine radiological practice.</p><p><strong>Results: </strong>Advanced interaction techniques allow task-specific adaptation of LLMs without the need for retraining. The creation of precise prompts and the use of zero-shot and few-shot learning can significantly improve response quality. RAG enables the integration of current and domain-specific information into LLM tools, increasing the accuracy and relevance of the generated content.</p><p><strong>Conclusions: </strong>The use of prompt engineering, zero-shot and few-shot learning, and RAG can optimize interaction with LLMs in radiology. 
Through these targeted strategies, radiologists can efficiently integrate general chatbots into routine practice to improve patient care.</p>\",\"PeriodicalId\":74635,\"journal\":{\"name\":\"Radiologie (Heidelberg, Germany)\",\"volume\":\" \",\"pages\":\"235-242\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2025-04-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Radiologie (Heidelberg, Germany)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1007/s00117-025-01416-2\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2025/2/21 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Radiologie (Heidelberg, Germany)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1007/s00117-025-01416-2","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2025/2/21 0:00:00","PubModel":"Epub","JCR":"","JCRName":"","Score":null,"Total":0}
[Optimized interaction with Large Language Models: A practical guide to Prompt Engineering and Retrieval-Augmented Generation].
Background: Given the increasing number of radiological examinations, large language models (LLMs) offer promising support in radiology. Optimized interaction is essential to ensure reliable results.
Objectives: This article provides an overview of interaction techniques such as prompt engineering, zero-shot learning, and retrieval-augmented generation (RAG) and gives practical tips for their application in radiology.
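As a minimal illustration of the first of these techniques, the sketch below shows what a zero-shot prompt for a routine radiology task might look like: a single, precise instruction with no worked examples. The task, wording, and report excerpt are illustrative assumptions and are not taken from the article.

```python
# Minimal sketch of a zero-shot prompt: one precise instruction, no examples.
# Task definition and report text are illustrative assumptions.

def build_zero_shot_prompt(report_excerpt: str) -> str:
    """Return an instruction-only (zero-shot) prompt for summarizing a finding."""
    return (
        "You are assisting a radiologist. Summarize the following report excerpt "
        "in one sentence of plain language suitable for the referring physician. "
        "Do not add findings that are not stated.\n\n"
        f"Report excerpt: {report_excerpt}"
    )

if __name__ == "__main__":
    # The resulting string would be sent to a general-purpose LLM chatbot.
    print(build_zero_shot_prompt(
        "CT abdomen: 12 mm hypodense lesion in liver segment VI, "
        "most consistent with a simple cyst."
    ))
```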
Materials and methods: Interaction techniques are demonstrated using practical examples, together with concrete recommendations for their application in routine radiological practice.
Results: Advanced interaction techniques allow task-specific adaptation of LLMs without the need for retraining. The creation of precise prompts and the use of zero-shot and few-shot learning can significantly improve response quality. RAG enables the integration of current and domain-specific information into LLM tools, increasing the accuracy and relevance of the generated content.
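To make the few-shot idea concrete, the following sketch shows how a small number of labeled examples can be prepended to the prompt for a simple triage-style classification, so the model infers the expected format and labels from the examples rather than from retraining. The examples, labels, and task definition are assumptions chosen for illustration, not taken from the article.

```python
# Minimal sketch of a few-shot prompt for a radiology triage-style task.
# Examples and labels are illustrative assumptions; model access is not shown.

FEW_SHOT_EXAMPLES = [
    ("Chest X-ray: no focal consolidation, no pleural effusion, no pneumothorax.",
     "unremarkable"),
    ("CT head: acute subdural hematoma with 6 mm midline shift.",
     "critical finding"),
]

def build_few_shot_prompt(report_excerpt: str) -> str:
    """Assemble a few-shot prompt: task instruction, labeled examples, new case."""
    lines = [
        "You are assisting a radiologist. Classify the report excerpt as "
        "'unremarkable' or 'critical finding'. Answer with the label only.",
        "",
    ]
    for excerpt, label in FEW_SHOT_EXAMPLES:
        lines.append(f"Report: {excerpt}")
        lines.append(f"Label: {label}")
        lines.append("")
    lines.append(f"Report: {report_excerpt}")
    lines.append("Label:")
    return "\n".join(lines)

if __name__ == "__main__":
    # The assembled prompt would be sent to a general-purpose LLM chatbot.
    print(build_few_shot_prompt(
        "CT pulmonary angiogram: filling defect in the right lower lobe artery."
    ))
```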
Conclusions: The use of prompt engineering, zero-shot and few-shot learning, and RAG can optimize interaction with LLMs in radiology. Through these targeted strategies, radiologists can efficiently integrate general-purpose chatbots into routine practice to improve patient care.
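The sketch below illustrates the RAG pattern in a deliberately simplified form: a naive bag-of-words retriever selects the most relevant snippet from a small domain corpus, and that snippet is inserted into the prompt as grounding context. The corpus content, the similarity measure, and the prompt wording are illustrative assumptions; production systems typically rely on embedding models and vector databases rather than word-count vectors.

```python
# Minimal sketch of retrieval-augmented generation (RAG):
# retrieve the most relevant snippet from a small domain corpus,
# then prepend it to the user question as context for the LLM.
# Corpus entries, scoring, and prompt wording are illustrative assumptions.

from collections import Counter
import math

CORPUS = [
    "Fleischner Society 2017: solid nodules smaller than 6 mm in low-risk "
    "patients generally require no routine follow-up.",
    "The risk of contrast-induced nephropathy increases with an eGFR below "
    "30 mL/min/1.73 m2.",
    "BI-RADS 4 lesions warrant tissue diagnosis (biopsy).",
]

def bag_of_words(text: str) -> Counter:
    """Very naive tokenization: lowercase, whitespace-split word counts."""
    return Counter(text.lower().split())

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def retrieve(question: str, corpus: list) -> str:
    """Return the corpus snippet most similar to the question (naive retrieval)."""
    q_vec = bag_of_words(question)
    return max(corpus, key=lambda doc: cosine_similarity(q_vec, bag_of_words(doc)))

def build_rag_prompt(question: str) -> str:
    """Augment the question with the retrieved snippet as grounding context."""
    context = retrieve(question, CORPUS)
    return (
        "Answer the question using only the context below. "
        "If the context is insufficient, say so.\n\n"
        f"Context: {context}\n\nQuestion: {question}\nAnswer:"
    )

if __name__ == "__main__":
    # The augmented prompt would be sent to a general-purpose LLM chatbot.
    print(build_rag_prompt(
        "Does a 4 mm solid nodule in a low-risk patient need follow-up?"
    ))
```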