{"title":"启发式评价与指导性评价:两种领域可用性专家评价方法的比较","authors":"Sehrish Nizamani;Saad Nizamani;Nazish Basir;Gulsher Laghari;Khalil Khoumbati;Sarwat Nizamani","doi":"10.1109/TPC.2022.3201732","DOIUrl":null,"url":null,"abstract":"<bold><i>Background:</i></b>\n The usability of university websites is important to ascertain that they serve their intended purpose. Their usability can be evaluated either by testing methods that rely on actual users or by inspection methods that rely on experts for evaluation. Heuristic evaluation and guideline reviews are two inspection methods of usability evaluation. A heuristic evaluation consists of a few general heuristics (rules), which are limited to checking general flaws in the design. A guideline review uses a much larger set of guidelines/suggestions that fit a specific business domain. \n<bold><i>Literature review:</i></b>\n Most of the literature has equated usability studies with testing methods and has given less focus to inspection methods. Moreover, those studies have examined usability in a general sense and not in domain- and culture-specific contexts. \n<bold><i>Research questions:</i></b>\n 1. Do domain- and culture-specific heuristic evaluation and guideline reviews work similarly in evaluating the usability of applications? 2. Which of these methods is better in terms of the nature of evaluation, time needed for evaluation, evaluation procedure, templates adopted, and evaluation results? 3. Which method is better in terms of thoroughness and reliability? \n<bold><i>Research methodology</i></b>\n: This study uses a comparative methodology. The two inspection methods—guideline reviews and heuristic evaluation—are compared in a domain- and the culture-specific context in terms of the nature, time required, approach, templates, and results. \n<bold><i>Results:</i></b>\n The results reflect that both methods identify similar usability issues; however, they differ in terms of the nature, time duration, evaluation procedure, templates, and results of the evaluation. \n<bold><i>Conclusion:</i></b>\n This study contributes by providing insights for practitioners and researchers about the choice of an evaluation method for domain- and culture-specific evaluation of university websites.","PeriodicalId":46950,"journal":{"name":"IEEE Transactions on Professional Communication","volume":null,"pages":null},"PeriodicalIF":1.6000,"publicationDate":"2022-10-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Heuristic Evaluation Versus Guideline Reviews: A Tale of Comparing Two Domain Usability Expert's Evaluation Methods\",\"authors\":\"Sehrish Nizamani;Saad Nizamani;Nazish Basir;Gulsher Laghari;Khalil Khoumbati;Sarwat Nizamani\",\"doi\":\"10.1109/TPC.2022.3201732\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<bold><i>Background:</i></b>\\n The usability of university websites is important to ascertain that they serve their intended purpose. Their usability can be evaluated either by testing methods that rely on actual users or by inspection methods that rely on experts for evaluation. Heuristic evaluation and guideline reviews are two inspection methods of usability evaluation. A heuristic evaluation consists of a few general heuristics (rules), which are limited to checking general flaws in the design. A guideline review uses a much larger set of guidelines/suggestions that fit a specific business domain. 
\\n<bold><i>Literature review:</i></b>\\n Most of the literature has equated usability studies with testing methods and has given less focus to inspection methods. Moreover, those studies have examined usability in a general sense and not in domain- and culture-specific contexts. \\n<bold><i>Research questions:</i></b>\\n 1. Do domain- and culture-specific heuristic evaluation and guideline reviews work similarly in evaluating the usability of applications? 2. Which of these methods is better in terms of the nature of evaluation, time needed for evaluation, evaluation procedure, templates adopted, and evaluation results? 3. Which method is better in terms of thoroughness and reliability? \\n<bold><i>Research methodology</i></b>\\n: This study uses a comparative methodology. The two inspection methods—guideline reviews and heuristic evaluation—are compared in a domain- and the culture-specific context in terms of the nature, time required, approach, templates, and results. \\n<bold><i>Results:</i></b>\\n The results reflect that both methods identify similar usability issues; however, they differ in terms of the nature, time duration, evaluation procedure, templates, and results of the evaluation. \\n<bold><i>Conclusion:</i></b>\\n This study contributes by providing insights for practitioners and researchers about the choice of an evaluation method for domain- and culture-specific evaluation of university websites.\",\"PeriodicalId\":46950,\"journal\":{\"name\":\"IEEE Transactions on Professional Communication\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":1.6000,\"publicationDate\":\"2022-10-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Professional Communication\",\"FirstCategoryId\":\"98\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/9911696/\",\"RegionNum\":2,\"RegionCategory\":\"文学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMMUNICATION\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Professional Communication","FirstCategoryId":"98","ListUrlMain":"https://ieeexplore.ieee.org/document/9911696/","RegionNum":2,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMMUNICATION","Score":null,"Total":0}
Heuristic Evaluation Versus Guideline Reviews: A Tale of Comparing Two Domain Usability Expert's Evaluation Methods
Background:
Evaluating the usability of university websites is important to ensure that they serve their intended purpose. Their usability can be evaluated either by testing methods, which rely on actual users, or by inspection methods, which rely on experts. Heuristic evaluation and guideline reviews are two inspection methods for usability evaluation. A heuristic evaluation applies a small set of general heuristics (rules), which are limited to checking general flaws in the design. A guideline review uses a much larger set of guidelines and suggestions tailored to a specific business domain.
Literature review:
Most of the literature has equated usability studies with testing methods and has given less attention to inspection methods. Moreover, those studies have examined usability in a general sense rather than in domain- and culture-specific contexts.
Research questions:
1. Do domain- and culture-specific heuristic evaluation and guideline reviews work similarly in evaluating the usability of applications? 2. Which of these methods is better in terms of the nature of evaluation, time needed for evaluation, evaluation procedure, templates adopted, and evaluation results? 3. Which method is better in terms of thoroughness and reliability?
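The abstract does not state how thoroughness and reliability are measured. As a point of reference, a common convention in the usability-inspection literature is to treat thoroughness as the share of known problems a method uncovers and reliability as the agreement between evaluators' reported problem sets. The short Python sketch below illustrates these conventional measures; the metric definitions, evaluator data, and function names are illustrative assumptions and not the paper's own procedure.

from itertools import combinations

def thoroughness(found: set, known: set) -> float:
    """Share of the known usability problems that a method uncovered (assumed definition)."""
    return len(found & known) / len(known) if known else 0.0

def jaccard(a: set, b: set) -> float:
    """Overlap between two evaluators' reported problem sets."""
    union = a | b
    return len(a & b) / len(union) if union else 1.0

def reliability(evaluator_findings: list) -> float:
    """Mean pairwise agreement across evaluators (one simple reliability proxy)."""
    pairs = list(combinations(evaluator_findings, 2))
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs) if pairs else 1.0

# Hypothetical data: three evaluators per method reviewing one university website.
known_problems = {"p1", "p2", "p3", "p4", "p5"}
heuristic_findings = [{"p1", "p2"}, {"p2", "p3"}, {"p1", "p3"}]
guideline_findings = [{"p1", "p2", "p4"}, {"p2", "p4", "p5"}, {"p1", "p4", "p5"}]

for name, findings in [("heuristic evaluation", heuristic_findings),
                       ("guideline review", guideline_findings)]:
    union_of_findings = set().union(*findings)
    print(f"{name}: thoroughness = {thoroughness(union_of_findings, known_problems):.2f}, "
          f"reliability = {reliability(findings):.2f}")

Under these assumed measures, higher thoroughness would favor the method that uncovers more of the domain-specific issues, while higher agreement would indicate more consistent results across evaluators.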
Research methodology:
This study uses a comparative methodology. The two inspection methods—guideline reviews and heuristic evaluation—are compared in a domain- and culture-specific context in terms of their nature, time required, approach, templates, and results.
Results:
The results show that both methods identify similar usability issues; however, they differ in the nature, duration, procedure, templates, and results of the evaluation.
Conclusion:
This study provides practitioners and researchers with insights into choosing an evaluation method for domain- and culture-specific evaluation of university websites.
Journal description:
The IEEE Transactions on Professional Communication is a peer-reviewed journal devoted to applied research on professional communication—including but not limited to technical and business communication. Papers should address the research interests and needs of technical communicators, engineers, scientists, information designers, editors, linguists, translators, managers, business professionals, and others from around the globe who practice, conduct research on, and teach others about effective professional communication. The Transactions publishes original, empirical research that addresses one of these contexts: the communication practices of technical professionals, such as engineers and scientists; the practices of professional communicators who work in technical or business environments; and evidence-based methods for teaching and practicing professional and technical communication.