2010 IEEE Sixth International Conference on e-Science: Latest Publications

Fault Detection in Distributed Climate Sensor Networks Using Dynamic Bayesian Networks
2010 IEEE Sixth International Conference on e-Science Pub Date : 2010-12-07 DOI: 10.1109/eScience.2010.22
George Chin, Sutanay Choudhury, L. Kangas, S. McFarlane, A. Márquez
{"title":"Fault Detection in Distributed Climate Sensor Networks Using Dynamic Bayesian Networks","authors":"George Chin, Sutanay Choudhury, L. Kangas, S. McFarlane, A. Márquez","doi":"10.1109/eScience.2010.22","DOIUrl":"https://doi.org/10.1109/eScience.2010.22","url":null,"abstract":"The Atmospheric Radiation Measurement (ARM) program operated by the U.S. Department of Energy is one of the largest climate research programs dedicated to the collection of long-term continuous measurements of cloud properties and other key components of the earth’s climate system. Given the critical role that collected ARM data plays in the analysis of atmospheric processes and conditions and in the enhancement and evaluation of global climate models, the production and distribution of high-quality data is one of ARM’s primary mission objectives. Fault detection in ARM’s distributed sensor network is one critical ingredient towards maintaining high quality and useful data. We are modeling ARM’s distributed sensor network as a dynamic Bayesian network where key measurements are mapped to Bayesian network variables. We then define the conditional dependencies between variables by discovering highly correlated variable pairs from historical data. The resultant dynamic Bayesian network provides an automated approach to identifying whether certain sensors are malfunctioning or failing in the distributed sensor network. A potential fault or failure is detected when an observed measurement is not consistent with its expected measurement and the observed measurements of other related sensors in the Bayesian network. We present some of our experiences and promising results with the fault detection dynamic Bayesian network.","PeriodicalId":441488,"journal":{"name":"2010 IEEE Sixth International Conference on e-Science","volume":"41 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-12-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117351106","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 13
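
The detection rule summarised in the abstract can be illustrated with a minimal sketch (not the authors' ARM implementation): one sensor's expected reading is predicted from a correlated neighbouring sensor via a linear-Gaussian dependency fitted on historical data, and a reading is flagged when its residual exceeds a threshold. The sensor names, synthetic data, and the 3-sigma threshold below are illustrative assumptions.

```python
# Minimal illustration of consistency-based fault detection between two
# correlated sensors; NOT the ARM dynamic Bayesian network from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical historical data: two strongly correlated temperature sensors.
sensor_a_hist = rng.normal(20.0, 5.0, 1000)
sensor_b_hist = 0.9 * sensor_a_hist + 2.0 + rng.normal(0.0, 0.5, 1000)

# Fit a linear-Gaussian dependency b ~ a (stand-in for a learned conditional).
slope, intercept = np.polyfit(sensor_a_hist, sensor_b_hist, 1)
residual_std = np.std(sensor_b_hist - (slope * sensor_a_hist + intercept))

def is_faulty(a_obs: float, b_obs: float, k: float = 3.0) -> bool:
    """Flag sensor B when its reading is inconsistent with the value
    expected given sensor A (|residual| beyond k standard deviations)."""
    expected_b = slope * a_obs + intercept
    return abs(b_obs - expected_b) > k * residual_std

print(is_faulty(22.0, 21.8))   # consistent reading -> False
print(is_faulty(22.0, 35.0))   # inconsistent reading -> True
```
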
Towards a Framework for Security in eScience
2010 IEEE Sixth International Conference on e-Science Pub Date : 2010-12-07 DOI: 10.1109/ESCIENCE.2010.19
Andrew P. Martin, J. Davies, Steve Harris
{"title":"Towards a Framework for Security in eScience","authors":"Andrew P. Martin, J. Davies, Steve Harris","doi":"10.1109/ESCIENCE.2010.19","DOIUrl":"https://doi.org/10.1109/ESCIENCE.2010.19","url":null,"abstract":"This paper describes an approach to the formulation and classification of security requirements in eScience. It explains why it is untenable to suggest that `one size fits all', and that what is an appropriate security solution in one context may not be at all appropriate in another. It proposes a framework for the description of eScience security in a number of different dimensions, in terms of measures taken and controls achieved. A distinctive feature of the framework is that these descriptions are organised into a set of discrete criteria, in most cases presented as levels of increasing assurance. The intended framework should serve as a basis for the systematic analysis of security solutions, facilitating the processes of design and approval, as well as for the identification of expectations and best practice in particular domains. The possible usage of the framework, and the value of the approach, is demonstrated in the paper through application to the design of a national data sharing service.","PeriodicalId":441488,"journal":{"name":"2010 IEEE Sixth International Conference on e-Science","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-12-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122358751","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 9
Scaling Benchmark of ESyS-Particle for Elastic Wave Propagation Simulations
2010 IEEE Sixth International Conference on e-Science Pub Date : 2010-12-07 DOI: 10.1109/eScience.2010.40
D. Weatherley, V. Boros, W. Hancock, S. Abe
{"title":"Scaling Benchmark of ESyS-Particle for Elastic Wave Propagation Simulations","authors":"D. Weatherley, V. Boros, W. Hancock, S. Abe","doi":"10.1109/eScience.2010.40","DOIUrl":"https://doi.org/10.1109/eScience.2010.40","url":null,"abstract":"The Discrete Element Method (DEM) is a popular particle-based numerical method for simulating geophysical processes including earthquakes, rock breakage and granular flow. Often simulations consisting of thousands of particles have insufficient resolution to reproduce the micromechanics of many geophysical processes, requiring millions of particles in some instances. The high computational expense of the DEM precludes execution of such problem sizes on desktop PCs. ESyS-Particle is a parallel implementation of the DEM, designed for execution on cluster supercomputers. Three-dimensional spatial domain decomposition is implemented using the MPI for interprocess communications. We present results of scaling benchmarks in which problem size per worker remains constant. As the number of workers increases from 27 to 1000, execution time remains near-constant, permitting simulations of 8.7M particles in approximately the same real time as simulations comprising 240K particles.","PeriodicalId":441488,"journal":{"name":"2010 IEEE Sixth International Conference on e-Science","volume":"38 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-12-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122697600","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 17
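
The benchmark described above is a weak-scaling study: the particle count per worker is fixed, so ideal behaviour is constant wall-clock time as workers are added. A minimal sketch of the efficiency calculation follows; the timing values and the per-worker particle count are hypothetical stand-ins, not the paper's measurements.

```python
# Weak-scaling efficiency: E(N) = T(reference) / T(N workers), with the
# problem size per worker held constant. Timings are illustrative only,
# not ESyS-Particle benchmark data.
reference_workers = 27
timings = {            # workers -> wall-clock seconds (hypothetical)
    27: 1000.0,
    216: 1030.0,
    1000: 1080.0,
}

t_ref = timings[reference_workers]
for workers, t in sorted(timings.items()):
    efficiency = t_ref / t
    particles = workers * 8700          # assume a fixed ~8,700 particles/worker
    print(f"{workers:5d} workers, ~{particles / 1e6:.2f}M particles, "
          f"weak-scaling efficiency = {efficiency:.2f}")
```
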
Scalable Social Simulation: Investigating Population-Scale Phenomena Using Commodity Computing
2010 IEEE Sixth International Conference on e-Science Pub Date : 2010-12-07 DOI: 10.1109/ESCIENCE.2010.46
A. Voss, Jing-Ya You, E. Yen, Hsin-Yen Chen, Simon C. Lin, A. Turner, Ji-Ping Lin
{"title":"Scalable Social Simulation: Investigating Population-Scale Phenomena Using Commodity Computing","authors":"A. Voss, Jing-Ya You, E. Yen, Hsin-Yen Chen, Simon C. Lin, A. Turner, Ji-Ping Lin","doi":"10.1109/ESCIENCE.2010.46","DOIUrl":"https://doi.org/10.1109/ESCIENCE.2010.46","url":null,"abstract":"Agent-based simulation models provide a way to investigate social phenomena that complement existing social science methods. Recent advances in computing hardware such as the wide availability of multi-core CPUs and increased main memory capacities make it possible to investigate population-scale phenomena using commodity compute resources. This paper describes experiences made in the development of an example model that utilises multiple CPU cores and an investigation of the scalability of the resulting code. We argue that commodity compute resources and commoditised simulation frameworks can now be used to simulate real-world populations and to use these simulations to investigate social phenomena such as migration.","PeriodicalId":441488,"journal":{"name":"2010 IEEE Sixth International Conference on e-Science","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-12-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116466494","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 5
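
As a rough illustration of spreading an agent-based time step over multiple cores, the sketch below uses only the Python standard library. It is not the authors' model or simulation framework; the agent attributes, the toy migration rule, and the population size are invented for illustration.

```python
# Minimal multi-core agent update with the standard library; a placeholder
# for the kind of parallel stepping discussed in the paper, not the
# authors' framework.
from multiprocessing import Pool
import random

def step_agent(agent):
    """Toy migration decision: an agent moves with a probability that
    depends on its income attribute (purely illustrative)."""
    income, region = agent
    if random.random() < 0.1 + 0.2 * (income < 20000):
        region = (region + 1) % 10      # move to a neighbouring region
    return (income, region)

if __name__ == "__main__":
    population = [(random.uniform(5000, 80000), random.randrange(10))
                  for _ in range(100_000)]
    with Pool() as pool:                # one worker per available core
        for _ in range(5):              # five simulated time steps
            population = pool.map(step_agent, population, chunksize=10_000)
    print("agents in region 0:", sum(1 for _, r in population if r == 0))
```
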
Tracking and Sketching Distributed Data Provenance
2010 IEEE Sixth International Conference on e-Science Pub Date : 2010-12-07 DOI: 10.1109/eScience.2010.51
T. Malik, L. Nistor, Ashish Gehani
{"title":"Tracking and Sketching Distributed Data Provenance","authors":"T. Malik, L. Nistor, Ashish Gehani","doi":"10.1109/eScience.2010.51","DOIUrl":"https://doi.org/10.1109/eScience.2010.51","url":null,"abstract":"Current provenance collection systems typically gather metadata on remote hosts and submit it to a central server. In contrast, several data-intensive scientific applications require a decentralized architecture in which each host maintains an authoritative local repository of the provenance metadata gathered on that host. The latter approach allows the system to handle the large amounts of metadata generated when auditing occurs at fine granularity, and allows users to retain control over their provenance records. The decentralized architecture, however, increases the complexity of auditing, tracking, and querying distributed provenance. We describe a system for capturing data provenance in distributed applications, and the use of provenance sketches to optimize subsequent data provenance queries. Experiments with data gathered from distributed workflow applications demonstrate the feasibility of a decentralized provenance management system and improvements in the efficiency of provenance queries.","PeriodicalId":441488,"journal":{"name":"2010 IEEE Sixth International Conference on e-Science","volume":"102 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-12-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122618476","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 33
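
One common way to realise a "provenance sketch" is a Bloom filter that summarises which provenance records a host holds, so a distributed ancestry query can skip hosts whose sketch reports no possible match. The code below is a generic illustration under that assumption, not the data structure or format used by the paper's system; the class name, sizes, and record identifiers are invented.

```python
# Generic Bloom-filter "sketch" of a host's provenance records, used to
# prune distributed provenance queries. Illustrative only.
import hashlib

class ProvenanceSketch:
    def __init__(self, size_bits: int = 4096, num_hashes: int = 4):
        self.size = size_bits
        self.k = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, record_id: str):
        # Derive k bit positions from salted SHA-256 digests of the record id.
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{record_id}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, record_id: str) -> None:
        for pos in self._positions(record_id):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def might_contain(self, record_id: str) -> bool:
        """False means definitely absent; True means possibly present."""
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(record_id))

# A remote host summarises its local provenance store in a small sketch;
# the querying host only contacts it when the sketch might match.
host_sketch = ProvenanceSketch()
host_sketch.add("file:/data/run42/output.nc")
print(host_sketch.might_contain("file:/data/run42/output.nc"))  # True
print(host_sketch.might_contain("file:/data/run99/other.nc"))   # almost surely False
```
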
Humanities e-Science: From Systematic Investigations to Institutional Infrastructures
2010 IEEE Sixth International Conference on e-Science Pub Date : 2010-12-07 DOI: 10.1109/eScience.2010.34
Tobias Blanke, M. Hedges
{"title":"Humanities e-Science: From Systematic Investigations to Institutional Infrastructures","authors":"Tobias Blanke, M. Hedges","doi":"10.1109/eScience.2010.34","DOIUrl":"https://doi.org/10.1109/eScience.2010.34","url":null,"abstract":"In this paper we bring together the results of a number of humanities e-research projects at King's College London. This programme of work was not carried out in an ad hoc manner, but was built on a rigorous methodological foundation, firstly by ensuring that the work was thoroughly grounded in the practice of humanities researchers (including 'digitally-aware' humanists), and secondly by analysing these practices in terms of 'scholarly primitives', basic activities common to research across humanities disciplines. The projects were then undertaken to provide systems and services that support various of these primitives, with a view to developing a research infrastructure constructed from these components, which may be regarded as a 'production line' for humanities research, supporting research activities from the creation of primary sources in digital form through to the publication of research outputs for discussion and re-use.","PeriodicalId":441488,"journal":{"name":"2010 IEEE Sixth International Conference on e-Science","volume":"3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-12-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128033483","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 6
The Evolution of myExperiment
2010 IEEE Sixth International Conference on e-Science Pub Date : 2010-12-07 DOI: 10.1109/ESCIENCE.2010.59
D. D. Roure, C. Goble, Sergejs Aleksejevs, S. Bechhofer, Jiten Bhagat, Don Cruickshank, Paul Fisher, Nandkumar Kollara, D. Michaelides, P. Missier, David R. Newman, Marcus Ramsden, M. Roos, K. Wolstencroft, E. Zaluska, Jun Zhao
{"title":"The Evolution of myExperiment","authors":"D. D. Roure, C. Goble, Sergejs Aleksejevs, S. Bechhofer, Jiten Bhagat, Don Cruickshank, Paul Fisher, Nandkumar Kollara, D. Michaelides, P. Missier, David R. Newman, Marcus Ramsden, M. Roos, K. Wolstencroft, E. Zaluska, Jun Zhao","doi":"10.1109/ESCIENCE.2010.59","DOIUrl":"https://doi.org/10.1109/ESCIENCE.2010.59","url":null,"abstract":"The myExperiment social website for sharing scientific workflows, designed according to Web 2.0 principles, has grown to be the largest public repository of its kind. It is distinctive for its focus on sharing methods, its researcher-centric design and its facility to aggregate content into sharable ‘research objects’. This evolution of myExperiment has occurred hand in hand with its users. myExperiment now supports Linked Data as a step toward our vision of the future research environment, which we categorise here as 3rd generation e-Research.","PeriodicalId":441488,"journal":{"name":"2010 IEEE Sixth International Conference on e-Science","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-12-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129682207","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 22
PODD - Towards an Extensible, Domain-Agnostic Scientific Data Management System
2010 IEEE Sixth International Conference on e-Science Pub Date : 2010-12-01 DOI: 10.1109/eScience.2010.44
Yuan-Fang Li, Gavin Kennedy, Faith Davies, J. Hunter
{"title":"PODD - Towards an Extensible, Domain-Agnostic Scientific Data Management System","authors":"Yuan-Fang Li, Gavin Kennedy, Faith Davies, J. Hunter","doi":"10.1109/eScience.2010.44","DOIUrl":"https://doi.org/10.1109/eScience.2010.44","url":null,"abstract":"Data management has become a critical challenge faced by a wide array of scientific disciplines in which the provision of sound data management is pivotal to the achievements and impact of research projects. Massive and rapidly expanding amounts of data combined with data models that evolve over time contribute to making data management an increasingly challenging task that warrants a rethinking of its design. In this paper we present PODD, an ontology-centric architecture for data management systems that is extensible and domain independent. In this architecture, the behaviors of domain concepts and objects are captured entirely by ontological entities, around which all data management tasks are carried out. The open and semantic nature of ontology languages also makes PODD amenable to greater data reuse and interoperability. To evaluate the PODD architecture, we have applied it to the challenge of managing phenomics data.","PeriodicalId":441488,"journal":{"name":"2010 IEEE Sixth International Conference on e-Science","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115415549","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 11
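
The ontology-centric idea can be sketched with a small RDF example: a domain object is described entirely by triples that reference ontology terms, so the data management layer needs no hard-coded domain schema. The vocabulary and URIs below are invented for illustration and are not PODD's actual ontology; the sketch only assumes the rdflib library.

```python
# Minimal illustration of describing a domain object purely through
# ontology terms (rdflib). The "phenomics" vocabulary is hypothetical,
# not the PODD ontology.
from rdflib import Graph, Literal, Namespace, RDF, RDFS, URIRef

EX = Namespace("http://example.org/phenomics#")   # invented vocabulary

g = Graph()
g.bind("ex", EX)

project = URIRef("http://example.org/data/project/42")
experiment = URIRef("http://example.org/data/experiment/7")

# Class and property definitions live in the ontology, not in application code.
g.add((EX.Project, RDF.type, RDFS.Class))
g.add((EX.Experiment, RDF.type, RDFS.Class))
g.add((EX.hasExperiment, RDFS.domain, EX.Project))
g.add((EX.hasExperiment, RDFS.range, EX.Experiment))

# Instance data: the object's structure is driven entirely by the ontology.
g.add((project, RDF.type, EX.Project))
g.add((project, RDFS.label, Literal("Drought-tolerance phenotyping")))
g.add((project, EX.hasExperiment, experiment))
g.add((experiment, RDF.type, EX.Experiment))

print(g.serialize(format="turtle"))
```
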
Design and Implementation of GXP Make -- A Workflow System Based on Make
2010 IEEE Sixth International Conference on e-Science Pub Date : 2010-12-01 DOI: 10.1109/eScience.2010.43
K. Taura, Takuya Matsuzaki, Makoto Miwa, Yoshikazu Kamoshida, Daisaku Yokoyama, N. Dun, Takeshi Shibata, Choi Sung Jun, Junichi Tsujii
{"title":"Design and Implementation of GXP Make -- A Workflow System Based on Make","authors":"K. Taura, Takuya Matsuzaki, Makoto Miwa, Yoshikazu Kamoshida, Daisaku Yokoyama, N. Dun, Takeshi Shibata, Choi Sung Jun, Junichi Tsujii","doi":"10.1109/eScience.2010.43","DOIUrl":"https://doi.org/10.1109/eScience.2010.43","url":null,"abstract":"This paper describes the rational behind designing workflow systems based on the Unix make by showing a number of idioms useful for workflows comprising many tasks. It also demonstrates a specific design and implementation of such a workflow system called GXP make. GXP make supports all the features of GNU make and extends its platforms from single node systems to clusters, clouds, supercomputers, and distributed systems. Interestingly, it is achieved by a very small code base that does not modify GNU make implementation at all. While being not ideal for performance, it achieved a useful performance and scalability of dispatching one million tasks in approximately 16,000 seconds (60 tasks per second, including dependence analysis) on an 8 core Intel Nehalem node. For real applications, recognition and classification of protein-protein interactions from biomedical texts on a supercomputer with more than 8,000 cores are described.","PeriodicalId":441488,"journal":{"name":"2010 IEEE Sixth International Conference on e-Science","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126764601","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 28
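
As background to the make-based design, the sketch below shows the core behaviour a make-style engine provides: a target is rebuilt only when it is missing or older than a prerequisite, and dependencies are built first. It is a generic Python illustration of that workflow model, not GXP make itself or its idioms; the rule table and shell commands are invented and assume a Unix environment.

```python
# Toy make-style executor: rebuild a target only if it is missing or older
# than any prerequisite, recursing into dependencies first. Generic
# illustration only, not GXP make.
import os
import subprocess

# rule: target -> (list of prerequisites, shell command)   (illustrative)
RULES = {
    "tokens.txt": (["corpus.txt"], "tr ' ' '\\n' < corpus.txt > tokens.txt"),
    "counts.txt": (["tokens.txt"], "sort tokens.txt | uniq -c > counts.txt"),
}

def mtime(path):
    return os.path.getmtime(path) if os.path.exists(path) else float("-inf")

def build(target):
    if target not in RULES:               # a plain source file, nothing to do
        return
    deps, command = RULES[target]
    for dep in deps:                      # build prerequisites first
        build(dep)
    newest_dep = max((mtime(d) for d in deps), default=float("-inf"))
    if not os.path.exists(target) or mtime(target) < newest_dep:
        print(f"make: {command}")
        subprocess.run(command, shell=True, check=True)

build("counts.txt")
```
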
A Local Sensitivity Analysis Method for Developing Biological Models with Identifiable Parameters: Application to L-type Calcium Channel Modelling
2010 IEEE Sixth International Conference on e-Science Pub Date : 2010-12-01 DOI: 10.1109/ESCIENCE.2010.56
Anna Sher, Ken Wang, A. Wathen, Gary R. Mirams, D. Abramson, D. Gavaghan
{"title":"A Local Sensitivity Analysis Method for Developing Biological Models with Identifiable Parameters: Application to L-type Calcium Channel Modelling","authors":"Anna Sher, Ken Wang, A. Wathen, Gary R. Mirams, D. Abramson, D. Gavaghan","doi":"10.1109/ESCIENCE.2010.56","DOIUrl":"https://doi.org/10.1109/ESCIENCE.2010.56","url":null,"abstract":"Computational cardiac models provide important insights into the underlying mechanisms of heart function. Parameter estimation in these models is an ongoing challenge with many existing models being overparameterised. Sensitivity analysis presents a key tool for exploring the parameter identifiability. While existing methods provide insight into the significance of the parameters, they are unable to identify redundant parameters in an efficient manner. We present a new singular value decomposition based algorithm for determining parameter identifiability in cardiac models. Using this local sensitivity approach, we investigate the Mahajan 2008 rabbit ventricular myocyte L-type calcium current model. We identify non-significant and redundant parameters and improve the Ical model by reducing it to a minimum one that is validated to have only identifiable parameters. The newly proposed approach provides a new method for model validation and evaluation of the predictive power of cardiac models.","PeriodicalId":441488,"journal":{"name":"2010 IEEE Sixth International Conference on e-Science","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123916474","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 4
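
The SVD-based identifiability idea can be sketched generically: build a local sensitivity matrix of model outputs with respect to parameters by finite differences, take its singular value decomposition, and treat directions with near-zero singular values as practically unidentifiable parameter combinations. The toy model below, with a deliberately redundant product of two parameters, is an invented example and not the L-type calcium channel model from the paper.

```python
# Generic SVD-based local identifiability check: columns of the sensitivity
# matrix S[i, j] = d(output_i)/d(theta_j), built by finite differences;
# near-zero singular values expose redundant parameter combinations.
# Toy model only, not the paper's ICaL model.
import numpy as np

def model(theta, t):
    """Toy model with a redundant parameterisation: only the product
    theta[0] * theta[1] affects the output."""
    a, b, tau = theta
    return a * b * np.exp(-t / tau)

t = np.linspace(0.0, 10.0, 50)
theta0 = np.array([2.0, 1.5, 3.0])

# Finite-difference sensitivity matrix, scaled by parameter magnitude.
eps = 1e-6
y0 = model(theta0, t)
S = np.empty((t.size, theta0.size))
for j in range(theta0.size):
    dtheta = np.zeros_like(theta0)
    dtheta[j] = eps * theta0[j]
    S[:, j] = (model(theta0 + dtheta, t) - y0) / eps   # relative sensitivity

U, sing, Vt = np.linalg.svd(S, full_matrices=False)
print("singular values:", sing)
# A near-zero singular value flags an unidentifiable direction; its row of
# Vt shows which parameter combination (here a vs. b) is redundant.
print("least identifiable direction:", Vt[-1])
```
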