Web Accessibility Evaluation in a Crowdsourcing-Based System with Expertise-Based Decision Strategy

Shuyi Song, Jiajun Bu, Ye Wang, Zhi Yu, Andreas Artmeier, Lianjun Dai, Can Wang
{"title":"基于专家决策策略的众包系统中网页可访问性评价","authors":"Shuyi Song, Jiajun Bu, Ye Wang, Zhi Yu, Andreas Artmeier, Lianjun Dai, Can Wang","doi":"10.1145/3192714.3192827","DOIUrl":null,"url":null,"abstract":"The rising awareness of accessibility increases the demand for Web accessibility evaluation projects to verify the implementation of Web accessibility guidelines and identify accessibility barriers in websites. However, the complexity of accessibility evaluation tasks and the lack of experts limits their scope and reduces their significance. Due to this complexity, they could not directly rely on a technique called crowdsourcing, which made great contributions in many fields by dividing a problem into many tedious micro-tasks and solving tasks in parallel. Addressing this issue, we develop a new crowdsourcing-based Web accessibility evaluation system with two novel decision strategies, golden set strategy and time-based golden set strategy. These strategies enable the generation of task results with high accuracy synthesized from micro-tasks solved by workers with heterogeneous expertise. An accessibility evaluation of 98 websites by 55 workers with varying experience verifies that our system can complete the evaluation in half the time with a 7.2% improvement on accuracy than the current approach.","PeriodicalId":330095,"journal":{"name":"Proceedings of the Internet of Accessible Things","volume":"111 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-04-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":"{\"title\":\"Web Accessibility Evaluation in a Crowdsourcing-Based System with Expertise-Based Decision Strategy\",\"authors\":\"Shuyi Song, Jiajun Bu, Ye Wang, Zhi Yu, Andreas Artmeier, Lianjun Dai, Can Wang\",\"doi\":\"10.1145/3192714.3192827\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The rising awareness of accessibility increases the demand for Web accessibility evaluation projects to verify the implementation of Web accessibility guidelines and identify accessibility barriers in websites. However, the complexity of accessibility evaluation tasks and the lack of experts limits their scope and reduces their significance. Due to this complexity, they could not directly rely on a technique called crowdsourcing, which made great contributions in many fields by dividing a problem into many tedious micro-tasks and solving tasks in parallel. Addressing this issue, we develop a new crowdsourcing-based Web accessibility evaluation system with two novel decision strategies, golden set strategy and time-based golden set strategy. These strategies enable the generation of task results with high accuracy synthesized from micro-tasks solved by workers with heterogeneous expertise. 
An accessibility evaluation of 98 websites by 55 workers with varying experience verifies that our system can complete the evaluation in half the time with a 7.2% improvement on accuracy than the current approach.\",\"PeriodicalId\":330095,\"journal\":{\"name\":\"Proceedings of the Internet of Accessible Things\",\"volume\":\"111 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-04-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"9\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the Internet of Accessible Things\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3192714.3192827\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the Internet of Accessible Things","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3192714.3192827","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 9

Abstract

The rising awareness of accessibility increases the demand for Web accessibility evaluation projects to verify the implementation of Web accessibility guidelines and identify accessibility barriers in websites. However, the complexity of accessibility evaluation tasks and the lack of experts limit their scope and reduce their significance. Because of this complexity, such projects cannot directly rely on crowdsourcing, a technique that has made great contributions in many fields by dividing a problem into many tedious micro-tasks and solving them in parallel. Addressing this issue, we develop a new crowdsourcing-based Web accessibility evaluation system with two novel decision strategies: a golden set strategy and a time-based golden set strategy. These strategies enable the generation of highly accurate task results synthesized from micro-tasks solved by workers with heterogeneous expertise. An accessibility evaluation of 98 websites by 55 workers with varying experience verifies that our system can complete the evaluation in half the time, with a 7.2% improvement in accuracy over the current approach.
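The abstract only names the two decision strategies; their exact formulations are not reproduced on this page. As a hedged illustration, the Python sketch below shows one plausible reading: a worker's expertise is estimated from accuracy on golden (known-answer) micro-tasks, the time-based variant discounts older golden evidence with an assumed exponential decay, and each task result is synthesized by expertise-weighted voting. All function names, the 0.5 default weight, and the half-life parameter are assumptions for illustration, not the authors' actual method.

```python
import math
from collections import defaultdict

def worker_accuracy(golden, answers):
    """Expertise estimate: accuracy on golden (known-answer) micro-tasks."""
    scored = [(t, a) for t, a in answers.items() if t in golden]
    if not scored:
        return 0.5  # no golden evidence yet: treat the worker as a coin flip
    return sum(a == golden[t] for t, a in scored) / len(scored)

def time_weighted_accuracy(golden, timed_answers, now, half_life=7 * 86400):
    """Time-based variant (assumed form): recent golden performance counts more."""
    num = den = 0.0
    for t, (a, ts) in timed_answers.items():
        if t in golden:
            w = math.exp(-math.log(2) * (now - ts) / half_life)  # exponential decay
            num += w * (a == golden[t])
            den += w
    return num / den if den else 0.5

def decide(task_answers, expertise):
    """Synthesize one micro-task result by expertise-weighted voting."""
    votes = defaultdict(float)
    for worker, answer in task_answers.items():
        votes[answer] += expertise.get(worker, 0.5)
    return max(votes, key=votes.get)

# Hypothetical usage: golden tasks calibrate expertise, which then weights votes.
golden = {"g1": "barrier", "g2": "no-barrier"}
history = {
    "alice": {"g1": "barrier", "g2": "no-barrier"},     # 2/2 on golden tasks
    "bob":   {"g1": "no-barrier", "g2": "no-barrier"},  # 1/2 on golden tasks
}
expertise = {w: worker_accuracy(golden, a) for w, a in history.items()}
# alice's vote (weight 1.0) outweighs bob's (weight 0.5) on a real micro-task:
print(decide({"alice": "barrier", "bob": "no-barrier"}, expertise))  # -> "barrier"
```

Under this reading, the golden set serves double duty: it screens out unreliable workers early and continuously recalibrates the vote weights, which is consistent with the abstract's claim of accurate results from workers with heterogeneous expertise.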