2023 IEEE/ACM International Workshop on Equitable Data & Technology (FairWare): Latest Publications

On Retrofitting Provenance for Transparent and Fair Software - Drivers and Challenges
2023 IEEE/ACM International Workshop on Equitable Data & Technology (FairWare). Pub Date: 2023-05-01. DOI: 10.1109/FairWare59297.2023.00007
Jens Dietrich, M. Galster, Markus Luczak-Rösch
Abstract: There have been ongoing discussions about how to ensure transparency and fairness in software that utilises artificial intelligence (AI). However, transparency and fairness are not limited to AI. Modern (non-AI) software is often constructed in a black-box fashion: components and services provide some functionality, but details on how it is achieved are hidden. Common software development and design principles such as encapsulation and information hiding promote this. Engineers often only look inside the black boxes when they need to fix problems, e.g., when tracing bugs or vulnerabilities. The demand for transparency has created a need to open those black boxes to non-engineers as well. For instance, businesses need to demonstrate regulatory compliance, and end users want to understand how systems make fair decisions that affect them. However, adding provenance (i.e., the ability to gather information about the data and algorithms used in systems) to existing systems is invasive and costly, and current approaches to collecting provenance data are not designed to expose that data to end users. We argue that this requires "provenance retrofitting", i.e., adding provenance capabilities to systems mechanically and exposing provenance data through standard language and service application programming interfaces (APIs). This could facilitate an infrastructure that supports transparency, which in turn can be used to create feedback mechanisms for users that, in the long term, can improve the fairness of software. In this paper we discuss drivers, objectives, key challenges, and some possible approaches to provenance retrofitting.
Citations: 0
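The abstract describes adding provenance capabilities to systems mechanically and exposing them through standard language APIs. The paper itself proposes no concrete code; as a purely illustrative sketch (the log structure, decorator, and `approve_loan` decision function are all hypothetical, not the authors' design), a Python decorator could instrument an existing decision function to record call-level provenance:

```python
import functools
import time

# Hypothetical provenance store: the paper does not specify a format;
# this list-of-records design is purely illustrative.
PROVENANCE_LOG = []

def record_provenance(func):
    """Wrap a function so each call appends a provenance record
    (function name, arguments, result, timestamp) to the log."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        PROVENANCE_LOG.append({
            "function": func.__name__,
            "args": args,
            "kwargs": kwargs,
            "result": result,
            "timestamp": time.time(),
        })
        return result
    return wrapper

@record_provenance
def approve_loan(income, debt):
    # Toy decision logic standing in for an opaque component.
    return income > 3 * debt

approve_loan(60000, 15000)
print(PROVENANCE_LOG[-1]["function"], PROVENANCE_LOG[-1]["result"])
```

A real retrofit would of course need durable storage and a query interface usable by non-engineers; the sketch only illustrates the "mechanical instrumentation" idea of wrapping existing code without modifying it.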
Fair-Siamese Approach for Accurate Fairness in Image Classification
2023 IEEE/ACM International Workshop on Equitable Data & Technology (FairWare). Pub Date: 2023-05-01. DOI: 10.1109/FairWare59297.2023.00005
Kwanhyong Lee, Van-Thuan Pham, Jiayuan He
Abstract: Machine learning models are trained by iteratively fitting their parameters to the features of the training data. These features may correlate with sensitive attributes such as race, age, or gender, so they can introduce discrimination against minority groups. In a recent study, a fair Siamese network was applied to discrete structured data under "accurate fairness" constraints, showing promising results in improving fairness without sacrificing accuracy. However, the data augmentation strategy used in that work cannot be applied to computer vision applications because it relies on a discrete perturbation method. In this paper, we adapt the structure of the fair Siamese approach to image classification and address the challenge of data augmentation using CycleGAN. We benchmark the accuracy and fairness of our approach against the adversarial debiasing method. The results show that this adaptation of the fair Siamese approach outperforms adversarial debiasing in both accuracy and fairness across a variety of image classification tasks.
Citations: 0
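The abstract gives no implementation details, so the following NumPy sketch is an assumption-laden illustration rather than the authors' method: the logistic branch, the loss form, and the weight `lam` are invented here, and the CycleGAN-generated counterfactual image is replaced by a hand-made stand-in vector. It shows only the general shape of a shared-weight Siamese fairness objective: a task loss on the original input plus a consistency penalty that discourages diverging predictions on the counterfactual.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=4)  # weights shared by both Siamese branches

def branch(x, w):
    """Single shared-weight branch: a logistic score for one input."""
    return 1.0 / (1.0 + np.exp(-x @ w))

def fair_siamese_loss(x, x_cf, y, w, lam=1.0):
    """Cross-entropy on the original input plus a consistency term
    penalising a different prediction on the counterfactual input."""
    p, p_cf = branch(x, w), branch(x_cf, w)
    task = -(y * np.log(p) + (1 - y) * np.log(1 - p))
    consistency = (p - p_cf) ** 2
    return task + lam * consistency

x = np.array([1.0, 0.5, -0.2, 0.3])      # stand-in for an image embedding
x_cf = np.array([0.9, 0.6, -0.1, 0.2])   # stand-in for its CycleGAN counterfactual
print(fair_siamese_loss(x, x_cf, 1, w))
```

When the counterfactual is identical to the input the consistency term vanishes and the loss reduces to the plain task loss, which is the sense in which such an objective can pursue fairness without giving up accuracy.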
Reflexive Practices in Software Engineering
2023 IEEE/ACM International Workshop on Equitable Data & Technology (FairWare). Pub Date: 2023-05-01. DOI: 10.1109/FairWare59297.2023.00006
Alicia E. Boyd
Abstract: Software plays a critical role in our daily lives, providing automated support for tasks in a wide range of domains. Behind many of the decisions that modern software makes is a data-driven infrastructure that attempts to produce equitable, unbiased decisions. However, numerous examples exist of data-driven software perpetuating societal inequities and further marginalizing populations. How do we attend to software fairness? What are the best approaches for software engineers to become more conscious of harmful impacts on the most vulnerable within our communities? Prior work recommends new tools to resolve unfair and biased outcomes; however, biased and inequitable software technology is an interdisciplinary problem that can no longer be addressed by technical solutions alone. Instead, we need to incorporate interdisciplinary methods to help address the inequity of software technology. This position paper introduces reflexivity from the social science literature to motivate and encourage software engineers to integrate reflexive practices throughout the entirety of the software engineering process.
Citations: 0