Sustainability of large language models—user perspective

Pavel Pipek, Shane Canavan, Susan Canavan, César Capinha, Jérôme MW Gippet, Ana Novoa, Petr Pyšek, Allan T Souza, Shengyu Wang, Ivan Jarić
Frontiers in Ecology and the Environment, 23(5). DOI: 10.1002/fee.2856. Published 2025-06-02. https://onlinelibrary.wiley.com/doi/10.1002/fee.2856

Abstract

Large language models (LLMs) are becoming an integral part of our daily work. In the field of ecology, LLMs are already being applied to a wide range of tasks, such as extracting georeferenced data or taxonomic entities from unstructured texts, information synthesis, coding, and teaching (Methods Ecol Evol 2024; Npj Biodivers 2024). Further development and increased use of LLMs in ecology, as in science in general, are likely to intensify and accelerate the research process and increase publication output, thus pressuring scientists to keep up with the elevated pace, which in turn creates a feedback loop by promoting even greater LLM use.

However, this all comes at a cost. Although these costs are not directly borne by end users, aside from occasional response delays, LLMs require considerable computational power and are energy-demanding during both their initial training phase and their subsequent operational use (Nature 2025). Furthermore, partly externalized energy costs arise from the intensive searching and processing of discovered sources as part of Deep Research. Currently, the total energy costs of LLMs remain difficult to estimate, largely due to limited transparency from the companies that develop them.

Ways to improve LLM sustainability, such as algorithmic or hardware optimization and the use of renewable energy during development and operation, have been extensively examined. However, we contend that the role of end users, including researchers, has been largely overlooked. End users can and should be part of the solution, to their own benefit. By selecting less resource-intensive options, optimizing their prompts, or choosing platforms that run on renewable energy, users would not only contribute to LLM sustainability but also improve their own workflows. Besides reducing energy consumption, more parsimonious use of LLMs could also lessen other harms, such as cooling-water use and the extraction of rare earth metals. Consequently, companies should support users in making such informed choices.

For instance, most companies provide LLMs of different complexities or sizes, often measured by the number of parameters. Relying on the largest models can be excessive in many cases (eg answering emails, checking grammar, or conducting searches that could be done by traditional search engines). By selecting a smaller, less energy-intensive model, users can also benefit from quicker responses. In addition, some smaller models are trained to perform specific tasks, eg coding, and can thus match or outperform bigger ones.
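The idea of matching task to model size can be made explicit in a user's own tooling. The sketch below is our illustration, not a recommendation from this editorial; the model names and task tiers are hypothetical placeholders:

```python
# Illustrative sketch: route each task to the smallest adequate model.
# Model names ("small-model", "code-model", "large-model") are hypothetical.

MODEL_BY_TASK = {
    "grammar_check": "small-model",         # lightweight general model
    "email_reply": "small-model",
    "coding": "code-model",                 # smaller task-specific model
    "literature_synthesis": "large-model",  # reserve the big model for hard tasks
}

def choose_model(task, default="large-model"):
    """Return the least resource-intensive model suited to the task."""
    return MODEL_BY_TASK.get(task, default)

print(choose_model("grammar_check"))  # small-model
print(choose_model("unknown_task"))   # falls back to large-model
```

A simple lookup like this also documents, for a research group, which tasks genuinely need a frontier model and which do not.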

Another way to reduce energy costs and save user time is to trim the expected length and complexity of the model's response. For some questions, an elaborate answer is unnecessary, if not counterproductive (ie because it takes time to read through); in other cases, only code is needed, without further explanation. Such requirements can be specified directly in the prompt.
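A minimal sketch of how such response constraints might be appended to a prompt programmatically; the constraint wording is our illustration, not prescribed by the editorial:

```python
# Illustrative sketch: append explicit length/format constraints to a prompt
# so the model returns only what is needed.

def constrain(prompt, code_only=False, max_sentences=None):
    """Add hypothetical response constraints to a base prompt."""
    constraints = []
    if code_only:
        constraints.append("Return only the code, with no explanation.")
    if max_sentences is not None:
        constraints.append(f"Answer in at most {max_sentences} sentences.")
    if not constraints:
        return prompt
    return prompt + "\n" + " ".join(constraints)

print(constrain("Write an R function that computes species richness.",
                code_only=True))
```

Shorter requested responses mean fewer generated tokens, which is where much of the per-query energy cost lies.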

In both everyday conversation with people and interactions with LLMs, an answer is most likely to meet expectations if the request is well formulated. It is therefore important to craft prompts carefully, to avoid rerunning models multiple times. While a comprehensive overview of optimal prompting lies beyond the scope of this editorial, we suggest a few simple approaches below.

Often it is advisable to leverage user expertise and provide the model with explicit guidance. For example, with coding, a prompt could include the names of the tools (eg libraries or packages for a specific programming language) that should be used, or even a basic description of the algorithm, including any known pitfalls, to avoid repeated prompts and regeneration of the entire code. Further, if a requested task requires up-to-date information, or if the tools that the model learned during training are already outdated, the model can be asked to conduct a web search.
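The kind of guidance described above can be captured in a reusable prompt template. This is a minimal sketch under our own assumptions; the task, packages, and pitfalls shown are hypothetical examples:

```python
# Illustrative sketch: a coding prompt that names the tools and flags known
# pitfalls up front, reducing the need to regenerate the whole answer.

def coding_prompt(task, language, packages, pitfalls):
    """Assemble a coding prompt with explicit tool and pitfall guidance."""
    lines = [
        f"Task: {task}",
        f"Language: {language}",
        "Use these packages: " + ", ".join(packages),
    ]
    if pitfalls:
        lines.append("Known pitfalls to avoid: " + "; ".join(pitfalls))
    return "\n".join(lines)

print(coding_prompt(
    task="Fit a species distribution model",      # hypothetical example task
    language="R",
    packages=["terra", "dismo"],
    pitfalls=["check for NA cells before fitting"],
))
```

A template like this front-loads the user's expertise into a single prompt, which is exactly what avoids the costly cycle of repeated regeneration.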

Whenever a prompt fits its purpose, it should be saved and, if possible, shared. Several repositories of prompts already exist, some of which include prompts that can be used in research; however, repositories of prompts tailored to research in the natural sciences are still lacking. There are also several prompt marketplaces, in which people offer prompts already optimized for specific tasks, reducing the need for repeated trial and error and saving energy and time in the process.

Likewise, a multitude of browser extensions (plug-ins) have been developed to help with prompt engineering. However, such tools should be built directly into the models’ web interfaces, so that more sustainable prompts can easily be adopted by a larger number of users. The companies that provide LLMs can indeed play a pivotal role in promoting sustainable LLM use, by giving user-friendly guidance on the most efficient use of their products. In addition, companies should be more transparent about their models’ energy costs (and the energy sources they use), guide interested users toward less demanding options, and let them choose to run only models powered by renewable energy.

While we believe that companies should not transfer responsibility for LLM sustainability to end users, users still have an important role to play: ensuring that their use of LLMs is rational and effective, and pressuring companies to give them the tools to do so.
