Towards Engineering Fair Ontologies: Unbiasing a Surveillance Ontology

Evangelos Paparidis, Konstantinos I. Kotis
{"title":"Towards Engineering Fair Ontologies: Unbiasing a Surveillance Ontology","authors":"Evangelos Paparidis, Konstantinos I. Kotis","doi":"10.1109/PIC53636.2021.9687030","DOIUrl":null,"url":null,"abstract":"Capturing knowledge in ontology-based AI applications may significantly propagate technical/statistical, cultural/social, cognitive/psychological, or other types of bias, to un-fair AI models and to their generated decisions. Biased ontologies (and consequently, knowledge graphs) engineered for intelligent surveillance applications can introduce technical barriers in fair capture of offenders, thus it must be researched as a first priority problem and a constant concern for explicit actions to be taken in the era of a more secure and fair world. In this paper we report preliminary research conducted on the novel topic of engineering fair ontologies and present first experiments with a prototype ontology and knowledge graph in the surveillance domain. Engineering fair ontologies is a quite new research topic, thus, the related work is at early stages. Having said that, in this paper we already highlight a recommended methodological approach for unbiasing ontologies, demonstrated in the surveillance domain, and we identify specific key research issues and challenges for further investigation by the ontology engineering community.","PeriodicalId":297239,"journal":{"name":"2021 IEEE International Conference on Progress in Informatics and Computing (PIC)","volume":"51 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE International Conference on Progress in Informatics and Computing (PIC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/PIC53636.2021.9687030","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

Capturing knowledge in ontology-based AI applications may significantly propagate technical/statistical, cultural/social, cognitive/psychological, or other types of bias into unfair AI models and the decisions they generate. Biased ontologies (and, consequently, biased knowledge graphs) engineered for intelligent surveillance applications can introduce technical barriers to the fair identification of offenders; this problem must therefore be treated as a first-priority research issue and a constant concern requiring explicit action in the pursuit of a more secure and fair world. In this paper we report preliminary research on the novel topic of engineering fair ontologies and present first experiments with a prototype ontology and knowledge graph in the surveillance domain. Because engineering fair ontologies is a new research topic, related work is still at an early stage. Nevertheless, we highlight a recommended methodological approach for unbiasing ontologies, demonstrated in the surveillance domain, and identify specific key research issues and challenges for further investigation by the ontology engineering community.
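The abstract does not detail the recommended unbiasing approach, so the following is only a minimal, hypothetical sketch of the kind of bias-prone axiom such work targets, together with a naive audit step over it, written with rdflib. All class and property names (ex:SuspectProfile, ex:hasNationality, ex:definedByAttribute, the PROTECTED list) are illustrative assumptions and are not taken from the authors' prototype ontology or their method.

```python
# Illustrative sketch only: a tiny surveillance-style ontology containing one
# bias-prone class definition, plus a naive check that flags classes whose
# definitions reference protected attributes. Names are hypothetical.
from rdflib import Graph, Namespace, RDF, RDFS, OWL

EX = Namespace("http://example.org/surveillance#")
g = Graph()
g.bind("ex", EX)

# Basic classes and a property (illustrative).
g.add((EX.Person, RDF.type, OWL.Class))
g.add((EX.Offender, RDF.type, OWL.Class))
g.add((EX.Offender, RDFS.subClassOf, EX.Person))
g.add((EX.hasNationality, RDF.type, OWL.DatatypeProperty))

# A bias-prone axiom: a profile class tied directly to a protected attribute
# (nationality) rather than to observed behaviour.
g.add((EX.SuspectProfile, RDF.type, OWL.Class))
g.add((EX.SuspectProfile, RDFS.subClassOf, EX.Person))
g.add((EX.SuspectProfile, EX.definedByAttribute, EX.hasNationality))

# Hand-maintained list of protected attributes (assumption for this sketch).
PROTECTED = {EX.hasNationality, EX.hasEthnicity, EX.hasReligion}

def flag_bias_prone_classes(graph):
    """Return classes whose axioms mention a protected attribute."""
    flagged = set()
    for cls in graph.subjects(RDF.type, OWL.Class):
        for _, _, obj in graph.triples((cls, None, None)):
            if obj in PROTECTED:
                flagged.add(cls)
    return flagged

print(flag_bias_prone_classes(g))  # expected: {EX.SuspectProfile}
```

In a real unbiasing workflow such a flag would presumably trigger human review and re-engineering of the offending axioms rather than automatic deletion; this sketch only illustrates how bias can be encoded structurally in an ontology and surfaced by a simple audit.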