Green prompt engineering for sustainable generative AI

IF 14.3 · CAS Region 1 (Environmental Sciences & Ecology) · JCR Q1 (Environmental Sciences)
Environmental Science and Ecotechnology · Pub Date: 2026-03-01 · Epub Date: 2026-03-13 · DOI: 10.1016/j.ese.2026.100684
Sanjay Podder, Hema Date, Shankar Murthy
Abstract

Prompt engineering involves the manual design and optimization of text-based instructions or queries, enabling precise control over the outputs generated by pre-trained large language models (LLMs) and ensuring alignment with desired responses. However, the substantial computational cost and energy footprint of the prompt inferencing process remain critical challenges when building generative AI applications. The energy efficiency of LLM inference is particularly affected by suboptimal prompts, which may require multiple iterations, thereby escalating energy consumption and the associated carbon footprint. To address these challenges, we propose a series of practices and guidelines designed to increase the likelihood of obtaining desired responses from LLMs with minimal reiteration. Empirical evaluation demonstrates that, across a range of LLMs and test scenarios, energy consumption and the corresponding operational greenhouse gas emissions were reduced by 32–48% when these best practices were applied. Drawing on these insights, our proposed best practices can be seamlessly integrated into the design frameworks of generative AI applications, thereby enhancing the energy efficiency of prompt inferencing. By addressing the challenge of establishing a cohesive framework for energy-efficient prompt design and inferencing, this paper advocates for the sustainable and effective deployment of generative AI technologies.
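The core mechanism in the abstract — suboptimal prompts forcing extra inference iterations, which multiply energy use and operational emissions — can be sketched with simple arithmetic. The sketch below is purely illustrative: the energy-per-call, iteration counts, and grid carbon intensity are hypothetical assumptions, not values taken from the paper.

```python
# Illustrative sketch (not the authors' method): fewer prompt reiterations
# translate directly into lower inference energy and operational emissions.
# All numeric inputs below are hypothetical assumptions.

def inference_emissions(energy_per_call_wh: float,
                        iterations: int,
                        grid_intensity_g_per_kwh: float = 400.0) -> float:
    """Operational CO2-equivalent emissions (grams) for a prompt session:
    per-call energy (Wh) * number of iterations, converted to kWh,
    scaled by the grid's carbon intensity (gCO2e/kWh)."""
    total_kwh = energy_per_call_wh * iterations / 1000.0
    return total_kwh * grid_intensity_g_per_kwh

# A vague prompt that needs 5 round-trips vs. a well-designed one needing 2.
baseline = inference_emissions(energy_per_call_wh=3.0, iterations=5)
optimized = inference_emissions(energy_per_call_wh=3.0, iterations=2)
reduction = 1.0 - optimized / baseline

print(f"baseline: {baseline:.2f} gCO2e, optimized: {optimized:.2f} gCO2e")
print(f"reduction: {reduction:.0%}")
```

Because emissions scale linearly with iteration count here, cutting reiterations from 5 to 2 yields a 60% reduction in this toy setup; the paper's empirically measured 32–48% range reflects real workloads where per-call energy also varies with prompt length and model.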
Citations: 0


Source journal metrics:
- CiteScore: 20.40
- Self-citation rate: 6.30%
- Articles published: 11
- Review time: 18 days
About the journal: Environmental Science & Ecotechnology (ESE) is an international, open-access journal publishing original research in environmental science, engineering, ecotechnology, and related fields. Authors publishing in ESE can immediately, permanently, and freely share their work. They have license options and retain copyright. Published by Elsevier, ESE is co-organized by the Chinese Society for Environmental Sciences, Harbin Institute of Technology, and the Chinese Research Academy of Environmental Sciences, under the supervision of the China Association for Science and Technology.