Atiya Nova, S. Sansalone, R. Robinson, Pejman Mirza-Babaei
{"title":"用GUR绘制未知地图:AI测试如何补充专家评估","authors":"Atiya Nova, S. Sansalone, R. Robinson, Pejman Mirza-Babaei","doi":"10.1145/3555858.3555880","DOIUrl":null,"url":null,"abstract":"Despite the advantages of using expert evaluation as a method within games user research (GUR) (i.e. provides stakeholders low cost, rapid feedback), it does not always accurately reflect the general player’s experience. Testing the game out with real users (also called playtesting) helps bridge this gap by giving game developers an in-depth look into the player experience. However, playtesting is resource intensive and time consuming, making it difficult to implement within the tight time frames of industry game development. AI can help to mitigate some of these issues by providing an automated way to simulate player behaviour and experience. In this paper, we introduce a tool called PathOS+—a playtesting interface which uses AI playtesting data to help enhance expert evaluation. Results from a study conducted with expert participants shows how PathOS+ could contribute to game design and assist developers and researchers in conducting expert evaluations. This is an important contribution as it provides game user researchers and designers with a fast, low-cost and effective game evaluation approach which has the potential to make game evaluation more accessible to indie and smaller game studios.","PeriodicalId":290159,"journal":{"name":"Proceedings of the 17th International Conference on the Foundations of Digital Games","volume":"50 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Charting the Uncharted with GUR: How AI Playtesting Can Supplement Expert Evaluation\",\"authors\":\"Atiya Nova, S. Sansalone, R. Robinson, Pejman Mirza-Babaei\",\"doi\":\"10.1145/3555858.3555880\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Despite the advantages of using expert evaluation as a method within games user research (GUR) (i.e. provides stakeholders low cost, rapid feedback), it does not always accurately reflect the general player’s experience. Testing the game out with real users (also called playtesting) helps bridge this gap by giving game developers an in-depth look into the player experience. However, playtesting is resource intensive and time consuming, making it difficult to implement within the tight time frames of industry game development. AI can help to mitigate some of these issues by providing an automated way to simulate player behaviour and experience. In this paper, we introduce a tool called PathOS+—a playtesting interface which uses AI playtesting data to help enhance expert evaluation. Results from a study conducted with expert participants shows how PathOS+ could contribute to game design and assist developers and researchers in conducting expert evaluations. 
This is an important contribution as it provides game user researchers and designers with a fast, low-cost and effective game evaluation approach which has the potential to make game evaluation more accessible to indie and smaller game studios.\",\"PeriodicalId\":290159,\"journal\":{\"name\":\"Proceedings of the 17th International Conference on the Foundations of Digital Games\",\"volume\":\"50 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-09-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 17th International Conference on the Foundations of Digital Games\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3555858.3555880\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 17th International Conference on the Foundations of Digital Games","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3555858.3555880","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Charting the Uncharted with GUR: How AI Playtesting Can Supplement Expert Evaluation
Despite the advantages of expert evaluation as a method within games user research (GUR), such as providing stakeholders with low-cost, rapid feedback, it does not always accurately reflect the general player's experience. Testing the game with real users (also called playtesting) helps bridge this gap by giving game developers an in-depth look into the player experience. However, playtesting is resource-intensive and time-consuming, making it difficult to implement within the tight time frames of industry game development. AI can help mitigate some of these issues by providing an automated way to simulate player behaviour and experience. In this paper, we introduce a tool called PathOS+, a playtesting interface which uses AI playtesting data to help enhance expert evaluation. Results from a study conducted with expert participants show how PathOS+ could contribute to game design and assist developers and researchers in conducting expert evaluations. This is an important contribution, as it provides game user researchers and designers with a fast, low-cost, and effective game evaluation approach that has the potential to make game evaluation more accessible to indie and smaller game studios.
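To make the idea of AI playtesting more concrete, the following is a minimal sketch of how simulated agents with different player profiles might traverse a level and produce data for an expert evaluator to review. It is not the PathOS+ implementation: the level layout, profile parameters (curiosity, caution), and logged metrics are hypothetical illustrations of the general approach of simulating player behaviour automatically.

```python
# Hedged sketch of agent-based playtesting (not PathOS+'s actual API).
# Agents with different motivation profiles pick points of interest to visit;
# the resulting logs stand in for the kind of data an expert might inspect.

import random
from dataclasses import dataclass, field


@dataclass
class PointOfInterest:
    name: str
    position: tuple   # (x, y) coordinates in the level (hypothetical)
    kind: str         # e.g. "objective", "secret", "hazard"


@dataclass
class AgentProfile:
    name: str
    curiosity: float  # preference for optional/secret content (0..1)
    caution: float    # tendency to avoid hazards (0..1)


@dataclass
class PlaytestLog:
    profile: str
    visited: list = field(default_factory=list)
    distance: float = 0.0
    hazards_hit: int = 0


def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5


def simulate(profile, pois, start=(0.0, 0.0), steps=6, seed=0):
    """Greedy, profile-weighted walk over points of interest."""
    rng = random.Random(seed)
    log = PlaytestLog(profile=profile.name)
    pos, remaining = start, list(pois)
    for _ in range(min(steps, len(pois))):
        def score(p):
            s = -dist(pos, p.position)           # prefer nearby content
            if p.kind == "secret":
                s += 10 * profile.curiosity      # curious agents detour for secrets
            if p.kind == "hazard":
                s -= 10 * profile.caution        # cautious agents avoid hazards
            return s + rng.uniform(0, 1)         # small random tie-break
        target = max(remaining, key=score)
        log.distance += dist(pos, target.position)
        log.visited.append(target.name)
        if target.kind == "hazard":
            log.hazards_hit += 1
        pos = target.position
        remaining.remove(target)
    return log


if __name__ == "__main__":
    level = [
        PointOfInterest("main gate", (5, 0), "objective"),
        PointOfInterest("hidden cave", (2, 8), "secret"),
        PointOfInterest("lava pit", (6, 3), "hazard"),
        PointOfInterest("boss arena", (10, 10), "objective"),
    ]
    profiles = [
        AgentProfile("completionist", curiosity=0.9, caution=0.7),
        AgentProfile("speedrunner", curiosity=0.1, caution=0.2),
    ]
    for p in profiles:
        log = simulate(p, level, seed=42)
        print(f"{log.profile}: visited {log.visited}, "
              f"distance {log.distance:.1f}, hazards hit {log.hazards_hit}")
```

In a workflow like the one the paper describes, an expert would not replace their own judgement with such simulations; rather, aggregated agent paths and metrics would serve as low-cost evidence to supplement an expert evaluation when recruiting real playtesters is impractical.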