Numerical coupling of energy efficiency and thermal performance for cold plate cooling optimization in high-density compute AI data centers

IF 7.1 · JCR Q1 (Construction & Building Technology) · CAS Zone 2 (Engineering & Technology)
Jinkyun Cho, Joo Hyun Moon
{"title":"Numerical coupling of energy efficiency and thermal performance for cold plate cooling optimization in high-density compute AI data centers","authors":"Jinkyun Cho,&nbsp;Joo Hyun Moon","doi":"10.1016/j.enbuild.2025.116441","DOIUrl":null,"url":null,"abstract":"<div><div>The rapid growth of AI-driven, high-density data centers has pushed conventional air cooling to its operational limits, creating an urgent need for more efficient thermal management solutions. This study develops a coupled numerical framework that integrates CFD-based thermal analysis at the component level with system- and building-level energy performance evaluation to optimize cold plate cooling systems for a 30 MW-class data center. A total of 49 matrix cases were simulated using a k–ε turbulence model, varying coolant supply temperature (S-Class) and flow rate to assess the thermal stability of high-power chips and the associated pressure drop. These CFD results were then translated into the sizing of key cooling system components, including the Technology Cooling System (TCS), Facility Water System (FWS), and Condenser Water System (CWS), from which PUE<sub>cooling</sub> was calculated. The findings show that higher flow rates enhance chip temperature stability but increase coolant pump power due to greater pressure drop, requiring a balance between thermal safety and energy efficiency. At the system level, all liquid cooling cases outperformed the conventional air-cooled baseline (PUE = 1.60). Optimized operating conditions achieved PUE<sub>cooling</sub> values below 1.1, representing significant efficiency gains. This work demonstrates the novelty of numerically coupling component-level thermal performance with system-level energy analysis for large-scale AI data centers. The methodology provides practical design insights for identifying operating ranges that ensure both thermal safety of high-power chips and energy-efficient cooling, offering a scalable and sustainable solution for next-generation data center operations.</div></div>","PeriodicalId":11641,"journal":{"name":"Energy and Buildings","volume":"348 ","pages":"Article 116441"},"PeriodicalIF":7.1000,"publicationDate":"2025-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Energy and Buildings","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0378778825011715","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"CONSTRUCTION & BUILDING TECHNOLOGY","Score":null,"Total":0}
Citations: 0

Abstract

The rapid growth of AI-driven, high-density data centers has pushed conventional air cooling to its operational limits, creating an urgent need for more efficient thermal management solutions. This study develops a coupled numerical framework that integrates CFD-based thermal analysis at the component level with system- and building-level energy performance evaluation to optimize cold plate cooling systems for a 30 MW-class data center. A total of 49 matrix cases were simulated using a k–ε turbulence model, varying coolant supply temperature (S-Class) and flow rate to assess the thermal stability of high-power chips and the associated pressure drop. These CFD results were then translated into the sizing of key cooling system components, including the Technology Cooling System (TCS), Facility Water System (FWS), and Condenser Water System (CWS), from which PUE_cooling was calculated. The findings show that higher flow rates enhance chip temperature stability but increase coolant pump power due to greater pressure drop, requiring a balance between thermal safety and energy efficiency. At the system level, all liquid cooling cases outperformed the conventional air-cooled baseline (PUE = 1.60). Optimized operating conditions achieved PUE_cooling values below 1.1, representing significant efficiency gains. This work demonstrates the novelty of numerically coupling component-level thermal performance with system-level energy analysis for large-scale AI data centers. The methodology provides practical design insights for identifying operating ranges that ensure both thermal safety of high-power chips and energy-efficient cooling, offering a scalable and sustainable solution for next-generation data center operations.
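
The flow-rate versus pump-power tradeoff described in the abstract can be made concrete with a back-of-the-envelope model. The Python sketch below is purely illustrative and is not the authors' coupled CFD/energy framework: the functions `pump_power_w` and `pue_cooling`, the loop constant `K_LOOP`, and all numeric values are assumed placeholders, and PUE_cooling is taken in the common partial-PUE sense (IT plus cooling power, divided by IT power).

```python
# Minimal sketch (not the paper's model): pump power vs. pressure drop, and
# the resulting partial PUE for the cooling system. All constants are
# hypothetical placeholders chosen only to illustrate the trend.

def pump_power_w(pressure_drop_pa: float, flow_rate_m3s: float,
                 pump_efficiency: float = 0.7) -> float:
    """Electrical pump power: P = dP * V_dot / eta_pump."""
    return pressure_drop_pa * flow_rate_m3s / pump_efficiency

def pue_cooling(it_power_w: float, cooling_power_w: float) -> float:
    """Partial PUE for cooling: (P_IT + P_cooling) / P_IT."""
    return (it_power_w + cooling_power_w) / it_power_w

# Assumed quadratic loop correlation dP ~ k * V_dot^2 (turbulent regime);
# K_LOOP is made up so that 1 m^3/s gives ~300 kPa across the loop.
K_LOOP = 3.0e5           # Pa / (m^3/s)^2, hypothetical
P_IT = 30e6              # 30 MW-class IT load, from the abstract
P_OTHER_COOLING = 2e6    # fixed draw of CDUs, towers, etc.; hypothetical

for flow in (0.5, 1.0, 1.5, 2.0):   # total loop flow in m^3/s, hypothetical
    dp = K_LOOP * flow ** 2
    p_pump = pump_power_w(dp, flow)
    pue = pue_cooling(P_IT, p_pump + P_OTHER_COOLING)
    print(f"flow={flow:.1f} m3/s  dP={dp / 1e3:.0f} kPa  "
          f"pump={p_pump / 1e3:.0f} kW  PUE_cooling={pue:.3f}")
```

Under the quadratic pressure-drop assumption, pump power grows with the cube of flow rate, so PUE_cooling degrades quickly at high flow even though chip temperatures become more stable, which is the balance between thermal safety and pumping energy that the abstract emphasizes.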
Source journal
Energy and Buildings
Category: Engineering & Technology – Civil Engineering
CiteScore: 12.70
Self-citation rate: 11.90%
Annual publications: 863
Review time: 38 days
Journal description: An international journal devoted to investigations of energy use and efficiency in buildings. Energy and Buildings publishes articles with explicit links to energy use in buildings, aiming to present new research results and new proven practice that reduce the energy needs of a building and improve indoor environment quality.