SoftwareX: Latest Articles

S-LiNE: An open-source LiDAR toolbox for dune coasts shoreline mapping
IF 2.4 · CAS Q4 · Computer Science
SoftwareX, Pub Date: 2025-07-10, DOI: 10.1016/j.softx.2025.102261
Jakub Śledziowski, Andrzej Giza, Paweł Terefenko
This paper presents an open-source toolbox designed to streamline shoreline detection and analysis directly from Light Detection and Ranging (LiDAR) raw point clouds in LAS format. The application is based on Python scripts and supports LiDAR datasets from both unmanned aerial vehicles (UAV) and airborne laser scanning (ALS). It performs key processing steps including elevation correction using a geoid model (for UAV data), shoreline extraction based on point cloud characteristics (intensity, red-green-blue (RGB) values, and scan angle for UAV data; classification for ALS data), and statistical comparison of shoreline positions over time. The tool features a graphical user interface built with Streamlit, enabling users to operate it without any programming experience. By eliminating the need for raster generation and external classification, the tool significantly reduces processing time while ensuring reproducibility. Output files are saved in widely used formats compatible with Geographic Information Systems (GIS), including GeoJSON, SHP, and CSV. The toolbox addresses a key gap in coastal monitoring workflows, offering a scalable, user-friendly solution for researchers and practitioners working with high-resolution coastal data.
SoftwareX 31 (2025), Article 102261.
Citations: 0
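The attribute-based extraction step lends itself to a compact illustration. The sketch below (plain Python, hard-coded points) applies an elevation band plus an intensity cutoff, exploiting the fact that water strongly absorbs near-infrared laser pulses; the attribute names and thresholds are illustrative assumptions, not S-LiNE's actual parameters, and in practice the attributes would come from a LAS reader such as laspy.

```python
# Synthetic stand-in for LAS point records: (x, y, z, intensity) tuples.
# Thresholds are illustrative, not the toolbox's actual parameters.
points = [
    (10.0, 5.0, 0.10, 40.0),   # low, dark return  -> candidate shoreline point
    (11.0, 5.5, 0.05, 200.0),  # low but bright    -> dry sand, rejected
    (12.0, 6.0, 2.50, 30.0),   # dark but elevated -> dune, rejected
    (13.0, 6.5, -0.15, 20.0),  # low, dark return  -> candidate shoreline point
]

water_level, dz, max_intensity = 0.0, 0.25, 60.0

def is_shoreline(p):
    """Attribute-based rule: near the water level and weak laser return
    (water absorbs near-infrared pulses, so intensities are low)."""
    x, y, z, intensity = p
    return abs(z - water_level) < dz and intensity < max_intensity

shoreline = [p for p in points if is_shoreline(p)]
print(len(shoreline))  # 2
```

In a real workflow the surviving points would then be exported to a GIS-compatible format such as GeoJSON or SHP.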
quicR: An R library for streamlined data handling of real-time quaking induced conversion assays
IF 2.4 · CAS Q4 · Computer Science
SoftwareX, Pub Date: 2025-07-07, DOI: 10.1016/j.softx.2025.102247
Gage R. Rowden, Peter A. Larsen
Real-time quaking induced conversion (RT-QuIC) has become a valuable diagnostic tool for protein misfolding disorders such as Creutzfeldt-Jakob disease and Parkinson's disease. Given that the technology is relatively new, academic and industry standards for quality-filtering data and high-throughput analysis of results have yet to be fully established. The open-source R library quicR was developed to provide a standardized approach to RT-QuIC data analysis. quicR provides functions for data curation, analysis, and visualization that can be easily integrated into existing R workflows.
SoftwareX 31 (2025), Article 102247.
Citations: 0
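quicR itself is an R library; purely to illustrate the kind of curve summary such a package standardizes, here is a hedged plain-Python sketch of the time-to-threshold statistic commonly reported for RT-QuIC wells (the function name and threshold rule are illustrative, not quicR's API):

```python
# Time to threshold: the first time point at which a well's fluorescence
# curve crosses a fixed threshold; wells that never cross are treated as
# negative. Data values are made up for the example.
def time_to_threshold(times, fluorescence, threshold):
    for t, f in zip(times, fluorescence):
        if f >= threshold:
            return t
    return None  # curve never crossed: negative well

times = [0, 4, 8, 12, 16, 20]                   # hours
well = [1000, 1100, 1300, 5000, 20000, 45000]   # relative fluorescence units
print(time_to_threshold(times, well, 10000))    # 16
```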
Marimba: A Python framework for structuring and processing FAIR scientific image datasets
IF 2.4 · CAS Q4 · Computer Science
SoftwareX, Pub Date: 2025-07-03, DOI: 10.1016/j.softx.2025.102251
Christopher J. Jackett, Kevin Barnard, Franziska Althaus, Nicolas Mortimer, David Webb, Candice Untiedt, Aaron Tyndall, Ian Jameson, Bec Gorton, Carlie Devine, Joanna Strzelecki, Peter H. Thrall, Ben Scoulding
The rapid advancement of scientific imaging technologies has created significant challenges in managing large-scale image datasets while maintaining compliance with FAIR (Findable, Accessible, Interoperable, and Reusable) data principles. We present Marimba, an open-source Python framework for structuring, processing, and packaging scientific image datasets. Marimba enhances data management through unified workflow processing, automated metadata embedding, efficient data handling, and standardized dataset packaging while integrating with the image FAIR Digital Object (iFDO) metadata standard. The framework's capabilities were evaluated through four diverse marine case studies involving multi-instrument microscopy, automated plankton imagery, deep-sea coral surveys, and historical image digitization. Marimba successfully processed datasets ranging from thousands to hundreds of thousands of images and videos, demonstrating robust performance and scalability. Marimba's modular architecture enables customization for specific research requirements while ensuring consistent data management practices. Results demonstrate Marimba's potential to advance scientific image data management by improving workflow efficiency, data quality, and adherence to FAIR principles throughout the research data lifecycle.
SoftwareX 31 (2025), Article 102251.
Citations: 0
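The sidecar-metadata idea behind FAIR image packaging can be sketched with the standard library alone. The field names below are illustrative placeholders, not Marimba's API or the iFDO schema; the point is only that each image travels with a machine-readable record including a checksum, keeping the dataset findable and verifiable.

```python
import hashlib
import json
import tempfile
from pathlib import Path

def package_image(image_path: Path, creator: str) -> Path:
    """Write a JSON sidecar next to an image with a checksum and minimal
    provenance. Field names are illustrative, not the iFDO schema."""
    digest = hashlib.sha256(image_path.read_bytes()).hexdigest()
    record = {"file": image_path.name, "sha256": digest, "creator": creator}
    sidecar = image_path.parent / (image_path.name + ".json")
    sidecar.write_text(json.dumps(record, indent=2))
    return sidecar

tmp = Path(tempfile.mkdtemp())
img = tmp / "frame_0001.png"
img.write_bytes(b"\x89PNG fake image bytes")   # stand-in for real image data
meta = package_image(img, "example-creator")
print(json.loads(meta.read_text())["file"])    # frame_0001.png
```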
An Item Response Theory-based R module for Algorithm Portfolio Analysis
IF 2.4 · CAS Q4 · Computer Science
SoftwareX, Pub Date: 2025-07-02, DOI: 10.1016/j.softx.2025.102239
Brodie Oldfield, Sevvandi Kandanaarachchi, Ziqi Xu, Mario Andrés Muñoz
Experimental evaluation is crucial in AI research, especially for assessing algorithms across diverse tasks. However, many studies evaluate only a limited set of algorithms and therefore fail to fully capture their strengths and weaknesses within a comprehensive portfolio. This paper introduces an Item Response Theory (IRT) based analysis tool for algorithm portfolio evaluation called AIRT-Module. Traditionally used in educational psychometrics, IRT models test-question difficulty and student ability from responses to test questions. Adapting IRT to algorithm evaluation, the AIRT-Module comprises a Shiny web application and the R package airt. AIRT-Module uses algorithm performance measures to compute anomalousness, consistency, and difficulty limits for each algorithm, together with the difficulty of test instances. The strengths and weaknesses of algorithms are visualised using the difficulty spectrum of the test instances. AIRT-Module offers a detailed understanding of algorithm capabilities across varied test instances, thus enhancing comprehensive AI method assessment. It is available at https://sevvandi.shinyapps.io/AIRT/.
SoftwareX 31 (2025), Article 102239.
Citations: 0
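For readers unfamiliar with IRT, the underlying machinery can be summarized by the standard two-parameter logistic model (a textbook form, shown only for orientation; AIRT-Module's exact parameterization may differ):

```latex
P(x_{ij} = 1 \mid \theta_j) = \frac{1}{1 + \exp\{-a_i(\theta_j - b_i)\}}
```

where \(\theta_j\) is the ability of respondent \(j\) (here, an algorithm), \(b_i\) the difficulty of item \(i\) (a test instance), and \(a_i\) the item's discrimination. In the algorithm-evaluation reading, an instance with high \(b_i\) is one that only strong algorithms solve well.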
PhaFiDyn: An explicit dynamic phase field damage model implementation
IF 2.4 · CAS Q4 · Computer Science
SoftwareX, Pub Date: 2025-07-01, DOI: 10.1016/j.softx.2025.102227
A. Barki, J. Zghal, L. Gallimard, I. Bruant, L. Davenne
Predicting the critical load that structures can sustain, together with the correct crack path, is an important scientific and practical issue, and the need to do so motivated the development of damage mechanics. Over recent decades, many models have demonstrated their ability to predict both critical loads and crack paths, including thick level set (TLS) models, phase field damage models, peridynamics, and, more recently, the Lip-Field damage model. This work demonstrates the capability of the PhaFiDyn software to predict crack paths using the unified formulation of the phase field damage model. PhaFiDyn is implemented on FEniCS, an open-source finite element library. Validation tests are performed against numerical and experimental results from the literature. PhaFiDyn is modular and can be readily extended.
SoftwareX 31 (2025), Article 102227.
Citations: 0
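For orientation, one widely used quasi-static phase-field fracture functional (the AT2 form; shown as a standard textbook reference point, not necessarily the exact unified dynamic formulation implemented in PhaFiDyn) reads:

```latex
E(u, d) = \int_\Omega (1 - d)^2 \, \psi\bigl(\varepsilon(u)\bigr) \, \mathrm{d}\Omega
        + \frac{G_c}{2} \int_\Omega \left( \frac{d^2}{\ell} + \ell \, \lvert \nabla d \rvert^2 \right) \mathrm{d}\Omega
```

where \(d \in [0, 1]\) is the damage (phase) field, \(\psi\) the elastic energy density of the strain \(\varepsilon(u)\), \(G_c\) the fracture toughness, and \(\ell\) the regularization length. Cracks emerge as bands where \(d \to 1\), which is what allows such models to predict crack paths without explicit crack tracking.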
Mhorseshoe package in R: Approximate algorithm for the horseshoe prior in Bayesian linear model
IF 2.4 · CAS Q4 · Computer Science
SoftwareX, Pub Date: 2025-06-30, DOI: 10.1016/j.softx.2025.102236
Mingi Kang, Kyoungjae Lee
The horseshoe prior is a continuous shrinkage prior frequently used in high-dimensional Bayesian sparse linear regression models. Although the horseshoe prior theoretically guarantees excellent shrinkage properties, performing a Markov Chain Monte Carlo (MCMC) algorithm incurs high computational costs per iteration. We introduce the Mhorseshoe package in R, which implements posterior inference under the horseshoe prior, based on the exact and approximate algorithms proposed in Johndrow et al. (2020). Furthermore, this package incorporates a novel adaptive selection method, which we developed and implemented to determine the tuning parameter in the approximate algorithm. We conducted a simulation study and confirmed that the algorithm can be effectively applied to large datasets.
SoftwareX 31 (2025), Article 102236.
Citations: 0
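For reference, the horseshoe prior is the standard global-local scale-mixture hierarchy for the linear model (some parameterizations omit \(\sigma^2\) in the slab):

```latex
y = X\beta + \varepsilon, \qquad \varepsilon \sim N(0, \sigma^2 I_n),
```

```latex
\beta_j \mid \lambda_j, \tau, \sigma \sim N(0, \sigma^2 \tau^2 \lambda_j^2), \qquad
\lambda_j \sim C^{+}(0, 1), \qquad \tau \sim C^{+}(0, 1),
```

where \(C^{+}(0,1)\) denotes the standard half-Cauchy distribution. The global scale \(\tau\) shrinks all coefficients toward zero, while the heavy-tailed local scales \(\lambda_j\) let genuinely large signals escape that shrinkage, which is the source of the prior's sparsity properties.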
Population potential on catchment area (PPCA): A Python-based tool for worldwide geospatial population analysis
IF 2.4 · CAS Q4 · Computer Science
SoftwareX, Pub Date: 2025-06-28, DOI: 10.1016/j.softx.2025.102245
Joan Perez, Giovanni Fusco
The Population Potential in Catchment Area (PPCA) protocol is a Python-based methodology designed to evaluate and analyze population distributions within specified pedestrian catchment areas globally. PPCA utilizes OpenStreetMap (OSM) and Global Human Settlement (GHS) data and employs Google Earth Engine for data acquisition and morphometric analysis. Through a series of four automated steps, the protocol cleans, processes, and classifies geospatial data, ultimately yielding refined population estimations within defined catchment regions. This protocol enables researchers and urban planners to assess the population that can potentially be accessed on foot using the street network, within given distances. The protocol allows this assessment globally with minimal input requirements, consisting mostly of bounding box coordinates.
SoftwareX 31 (2025), Article 102245.
Citations: 0
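The core catchment idea, summing the population reachable within a network distance along streets, reduces to a cutoff Dijkstra search over an edge-weighted graph. The sketch below uses only the standard library; the toy graph, weights, and the 800 m cutoff are illustrative assumptions, while PPCA itself works on OSM street networks with GHS population data.

```python
import heapq

edges = {  # node -> [(neighbour, street length in metres)]
    "A": [("B", 300), ("C", 500)],
    "B": [("A", 300), ("D", 400)],
    "C": [("A", 500)],
    "D": [("B", 400)],
}
population = {"A": 120, "B": 80, "C": 200, "D": 50}

def catchment_population(source, cutoff):
    """Sum the population at all nodes within `cutoff` metres of `source`
    along the street network (Dijkstra with an early distance cutoff)."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist[node]:
            continue  # stale queue entry
        for nbr, w in edges[node]:
            nd = d + w
            if nd <= cutoff and nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return sum(population[n] for n in dist)

print(catchment_population("A", 800))  # A(0) + B(300) + C(500) + D(700) -> 450
```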
PrimeSpecPCR: Python toolkit for species-specific DNA primer design and specificity testing
IF 2.4 · CAS Q4 · Computer Science
SoftwareX, Pub Date: 2025-06-28, DOI: 10.1016/j.softx.2025.102249
Adam Kuzdraliński
PrimeSpecPCR is an open-source Python toolkit that automates the workflow of species-specific primer design (comprising forward primer, reverse primer, and probe) and validation. The software implements a modular architecture comprising four main components: (1) automated retrieval of genetic sequences from NCBI databases based on taxonomy identifiers; (2) multiple sequence alignment using MAFFT to generate consensus sequences; (3) thermodynamically optimized primer and probe design via Primer3-py; and (4) multi-tiered specificity testing against the NCBI GenBank database. The toolkit features a user-friendly graphical interface and customizable parameters for quantitative PCR (qPCR) applications. PrimeSpecPCR accelerates primer development through parallel processing, automatic caching of intermediate results, and production of interactive HTML reports that visualize specificity profiles across taxonomic groups, while minimizing human errors and ensuring reproducibility of results. This toolkit reduces the time-intensive, labour-demanding processes conventionally required for designing species-specific molecular assays.
SoftwareX 31 (2025), Article 102249.
Citations: 0
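One elementary screening criterion in primer design is the melting temperature. The sketch below implements the classic Wallace rule, Tm = 2(A+T) + 4(G+C), valid only for short oligonucleotides; it is a quick illustrative approximation, whereas PrimeSpecPCR delegates the thermodynamics to Primer3-py's nearest-neighbour models.

```python
def wallace_tm(primer: str) -> int:
    """Wallace-rule melting temperature (deg C) for a short oligo:
    2 degrees per A/T base, 4 degrees per G/C base."""
    primer = primer.upper()
    at = primer.count("A") + primer.count("T")
    gc = primer.count("G") + primer.count("C")
    return 2 * at + 4 * gc

print(wallace_tm("ATGCGCTAGCTAGGCT"))  # 7 A/T + 9 G/C -> 50
```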
GEMM-ArchProfiler: A simulation framework for hardware-level profiling and performance analysis of General Matrix Multiplication in real CNN workloads on heterogeneous CPU architectures
IF 2.4 · CAS Q4 · Computer Science
SoftwareX, Pub Date: 2025-06-28, DOI: 10.1016/j.softx.2025.102243
Binu Ayyappan, G. Santhosh Kumar
In this paper, the authors present GEMM-ArchProfiler, a simulation framework for evaluating General Matrix Multiplication (GEMM) performance in convolutional neural networks. Targeted at resource-constrained edge and IoT systems, which rely on CPU-based architectures, the framework addresses hardware limitations through optimized workload profiling. Powered by the gem5 simulator, GEMM-ArchProfiler provides insights into memory usage, cache behavior, execution latency, and energy consumption. It integrates customized Darknet libraries to simulate realistic CNN workloads and includes a user-friendly CPU configuration mechanism and event analysis script. This tool bridges workload analysis and deployment, aiding efficient AI implementation on diverse CPU architectures.
SoftwareX 31 (2025), Article 102243.
Citations: 0
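The kernel under study, GEMM (C = A x B), is shown below in its naive triple-loop form as a pure-Python sketch for clarity. It is this memory-access pattern whose cache and latency behaviour such profilers measure; production CNN backends run blocked, vectorised implementations of the same operation.

```python
def gemm(a, b):
    """Naive general matrix multiply: c[i][j] = sum_p a[i][p] * b[p][j]."""
    n, k, m = len(a), len(b), len(b[0])
    c = [[0.0] * m for _ in range(n)]
    for i in range(n):          # rows of A
        for j in range(m):      # columns of B
            for p in range(k):  # inner (reduction) dimension
                c[i][j] += a[i][p] * b[p][j]
    return c

print(gemm([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19.0, 22.0], [43.0, 50.0]]
```

The loop order (i, j, p) streams through B column-wise, which is exactly the kind of cache-unfriendly access pattern that motivates hardware-level profiling of GEMM variants.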
Vulnerable Neighborhood Explorer (VNE): An Open-Source Visual Analytics Tool for Exploring Social Vulnerability to Disasters across Different Neighborhoods
IF 2.4 · CAS Q4 · Computer Science
SoftwareX, Pub Date: 2025-06-28, DOI: 10.1016/j.softx.2025.102233
Su Yeon Han, Jooyoung Yoo, Alexander Michels, Jeon-Young Kang, Shaowen Wang, Joon-Seok Kim
We developed the Vulnerable Neighborhood Explorer (VNE), a geovisual analytics tool that helps decision-makers and researchers identify disaster-prone neighborhoods using socioeconomic and demographic data. By creating neighborhood boundaries from user-provided data, VNE enables analysis of disparities in disaster impacts (e.g., casualties, injuries, infections) and their socioeconomic drivers. It employs clustering algorithms and a dynamic interface with Coordinated and Multiple Views (CMV), linking maps and charts interactively. Features like cross-filtering and brushing instantly update visualizations, allowing seamless exploration of vulnerable neighborhoods and their population characteristics in specific disaster contexts.
SoftwareX 31 (2025), Article 102233.
Citations: 0
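The clustering step behind tools of this kind can be illustrated in miniature: group neighborhoods by a single vulnerability score with a 1-D two-cluster k-means (a few Lloyd iterations, standard library only). The scores, initial centers, and two-cluster choice are illustrative assumptions; real use is multivariate (income, age, housing, and so on) with a proper library implementation, not VNE's actual algorithm.

```python
import statistics

scores = [0.10, 0.15, 0.20, 0.70, 0.80, 0.90]  # illustrative index per tract

def kmeans_1d(xs, c0, c1, iters=10):
    """Two-cluster 1-D k-means: assign each point to the nearer center,
    then move each center to its cluster mean; repeat."""
    for _ in range(iters):
        g0 = [x for x in xs if abs(x - c0) <= abs(x - c1)]
        g1 = [x for x in xs if abs(x - c0) > abs(x - c1)]
        c0, c1 = statistics.mean(g0), statistics.mean(g1)
    return g0, g1

low, high = kmeans_1d(scores, 0.0, 1.0)
print(high)  # the more vulnerable group: [0.7, 0.8, 0.9]
```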