Proceedings of the 4th International Workshop on Practical Reproducible Evaluation of Computer Systems: Latest Publications

Artifact Description/Artifact Evaluation: A Reproducibility Bane or a Boon
T. Malik
DOI: 10.1145/3456287.3465479

Abstract: Several systems research conferences now incorporate an artifact description and artifact evaluation (AD/AE) process as part of the paper submission. Authors of accepted papers optionally submit a plethora of artifacts: documentation, links, tools, code, data, and scripts for independent validation of the claims in their paper. An artifact evaluation committee (AEC) evaluates the artifacts and stamps papers with accepted artifacts, which then receive publisher badges. Does this AD/AE process serve authors and reviewers? Is it scalable for large conferences such as SCxy? Using the last three SCxy Reproducibility Initiatives as the basis, this talk will analyze the benefits and the miseries of the AD/AE process. We will present a data-driven approach, using survey results to analyze the technical and human challenges in conducting the AD/AE process. Our method will distinguish studies that benefit from AD, i.e., increased transparency, from areas that benefit from AE.

The AD/AE research objects [1] present an interesting set of data management and systems challenges [2,3]. We will look under the hood of the research objects, describe their prominent characteristics, and show how cloud infrastructures, documented workflows, and reproducible containers [4] ease some of the AD/AE process hand-shakes. Finally, we will present a vision for the resulting curated, reusable research objects: such research objects are a treasure in themselves for advancing computational reproducibility and making reproducible evaluation practical in the coming years.

Citations: 0
Experiences with Reproducibility: Case Studies from Scientific Workflows
D. Ghoshal, Drew Paine, G. Pastorello, Abdelrahman Elbashandy, D. Gunter, O. Amusat, L. Ramakrishnan
DOI: 10.1145/3456287.3465478

Abstract: Reproducible research is becoming essential for science, both to ensure transparency and to build trust. Reproducibility also provides the cornerstone for sharing methodology, which can improve efficiency. Although several tools and studies focus on computational reproducibility, we need a better understanding of the gaps, issues, and challenges in enabling reproducibility of scientific results beyond the computational stages of a scientific pipeline. In this paper, we present five case studies that highlight reproducibility needs and challenges under various system and environmental conditions. Through the case studies, we relate our experiences in reproducing the different types of data and methods that exist in an experimental or analysis pipeline. We examine the human aspects of reproducibility while highlighting what worked, what did not work, and what could have worked better in each case.

Our experiences capture a wide range of scenarios and are applicable to a much broader audience aiming to integrate reproducibility into their everyday pipelines.

Citations: 1
HELIPORT
Oliver Knodel, M. Voigt, Robert Ufer, David Pape, M. Lokamani, Stefan E. Müller, Thomas Gruber, G. Juckeland
DOI: 10.1145/3456287.3465477

Abstract: Modern scientific collaborations and projects (MSCPs) employ various processing stages, starting with proposal submission, continuing with data acquisition, and concluding with final publications. Realizing such MSCPs poses a huge challenge due to (1) the complexity and diversity of the tools, (2) the heterogeneity of the involved computing and experimental platforms, (3) the flexibility of analysis targets with respect to data acquisition, and (4) data throughput. Another challenge for MSCPs is to provide additional metadata according to the FAIR principles for all processing stages, for both internal and external use. Consequently, the demand has risen considerably for a system that assists the scientist in all project stages and archives all processes on the basis of metadata standards such as DataCite, making everything transparent, understandable, and citable. The aim of this project is the development of the HELmholtz ScIentific Project WORkflow PlaTform (HELIPORT), which ensures data provenance by accommodating the complete life cycle of a scientific project and linking all employed programs and systems. The modular structure of HELIPORT enables deployment of the core applications to different Helmholtz centers (HZs) and can be adapted to center-specific needs simply by adding or replacing individual components.

HELIPORT is based on modern web technologies and can be used on different platforms.

Citations: 2
Proceedings of the 4th International Workshop on Practical Reproducible Evaluation of Computer Systems (front matter)
DOI: 10.1145/3456287

Citations: 0