Analyzing and Understanding the Impact of Interconnect Performance on HPC, Big Data, and Deep Learning Applications: A Case Study with InfiniBand EDR and HDR

Amit Ruhela, Shulei Xu, K. V. Manian, H. Subramoni, D. Panda

2020 IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW), May 2020
DOI: 10.1109/IPDPSW50202.2020.00147
Communication interfaces of High Performance Computing (HPC) systems, Cloud middleware, and Deep Learning (DL) frameworks have been continually evolving to meet the ever-increasing communication demands placed on them by HPC, Cloud, and DL applications. Modern high-performance interconnects such as InfiniBand EDR and InfiniBand HDR are capable of delivering 100 Gbps and 200 Gbps, respectively. However, no previous study has demonstrated how much benefit an end user in the HPC, Cloud, and DL computing domains can expect from adopting the newer generations of these interconnects over the older ones. In this paper, we evaluate the InfiniBand EDR and HDR high-performance interconnects over the PCIe Gen3 interface with HPC, Cloud, and DL workloads. Our comprehensive analysis, conducted at multiple levels, provides a global view of the impact these modern interconnects have on the performance of HPC, Cloud, and DL applications. The results of our experiments show that the latest InfiniBand HDR interconnect delivers the best performance in all three computing domains.
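Evaluations like the one the abstract describes typically start from ping-pong microbenchmarks that measure point-to-point latency and bandwidth between two endpoints (e.g., MPI-level tests in the style of the OSU Micro-Benchmarks). The sketch below is a hypothetical, self-contained stand-in for that methodology: it times round-trip message exchanges over a local Unix socket pair rather than over InfiniBand, so the numbers it reports reflect only local IPC, but the measurement structure (fixed message size, repeated exchanges, latency and bandwidth derived from elapsed time) mirrors what such interconnect benchmarks do.

```python
# Hypothetical ping-pong microbenchmark sketch. A local socketpair stands in
# for the network link so the example runs anywhere; a real interconnect
# study would run the same loop over MPI between two nodes.
import socket
import time


def _recv_exact(sock, n):
    """Receive exactly n bytes from sock (recv may return partial reads)."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed early")
        buf += chunk
    return buf


def ping_pong(msg_size, iters=100):
    """Return (avg one-way latency in microseconds, bandwidth in MB/s)
    for repeated exchanges of msg_size-byte messages."""
    a, b = socket.socketpair()
    payload = b"x" * msg_size
    start = time.perf_counter()
    for _ in range(iters):
        a.sendall(payload)          # "ping": a -> b
        _recv_exact(b, msg_size)
        b.sendall(payload)          # "pong": b -> a
        _recv_exact(a, msg_size)
    elapsed = time.perf_counter() - start
    a.close()
    b.close()
    one_way = elapsed / (2 * iters)         # seconds per one-way transfer
    latency_us = one_way * 1e6
    bandwidth_mb_s = msg_size / one_way / 1e6
    return latency_us, bandwidth_mb_s


if __name__ == "__main__":
    # Sweep a few message sizes, as interconnect benchmarks commonly do:
    # small messages expose latency, large ones expose bandwidth.
    for size in (8, 1024, 16384):
        lat, bw = ping_pong(size)
        print(f"{size:>6} B: {lat:8.2f} us one-way, {bw:10.2f} MB/s")
```

Sweeping message sizes this way is what lets a study separate the latency-bound regime (where EDR and HDR may look similar) from the bandwidth-bound regime (where HDR's 200 Gbps link rate can show its advantage).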