A Feel for the Game: AI, Computer Games and Perceiving Perception

Mark A. Ouellette, S. Conway
{"title":"A Feel for the Game: AI, Computer Games and Perceiving Perception","authors":"Mark A. Ouellette, S. Conway","doi":"10.7557/23.6169","DOIUrl":null,"url":null,"abstract":"I walk into the room and the smell of burning wood hits me immediately. The warmth from the fireplace grows as I step nearer to it. The fire needs to heat the little cottage through night so I add a log to the fire. There are a few sparks and embers. I throw a bigger log onto the fire and it drops with a thud. Again, there are barely any sparks or embers. The heat and the smell stay the same. They don’t change and I do not become habituated to it. Rather, they are just a steady stream, so I take off my VR headset and give my recommendations to the team programming the gameified world of the virtual museum of the future (one depicting an ancient Turkish settlement, being built now at the institution where one of us works). As much as this technological world seems almost too futuristic, it actually retrieves obsolete items from the past—a heater, a piece of wood, and a spray bottle—in keeping with McLuhan’s (1973) insights regarding media that provide strong participation goals and the rubric for achieving them. Moreover, the VR world extends the progression of game AI that occasioned the love-hate relationship with the “walking sim.” The stronger the AI, the more clearly defined the rubric for participation. In the VR interactive museum the designers want people to be able to “play” with haptic devices—like the smell, smoke, and heat generators—in order to heighten not only the immersion but also the perception of being there, or what Bolter and Grusin (1999) call “immediacy.” Indeed, Bolter and Grusin argue that the need for immediacy overwhelmingly takes over, regardless of the media’s intrusion. However, in the example above, the system fell short because the designers had not figured for someone laying down the “log” on the virtual fire and having it send a representative—that is, a perceptual, based on experience, intuition, etc.—amount of sparks and heat. Someone else could throw the log as hard as they want. The machine only senses log in or log out. This corresponds precisely with how we feel about phenomena, for machines and AI are based upon a model of intelligence which prioritises mental representation and symbolic manipulation. For Laird and van Lent (2001), in their field defining presentation, the “killer app” of human-level AI was going to be computer games. Writing a decade later in the same conference proceedings, Weber, Mateas, and Jhala (2011), are still responding to this original position, by way of AI in strategy games. Writing for this year’s, IEEE meeting Petrović (2018) also makes the case for human-level AI in games. What becomes clear, then, is that as much as we have wanted games to offer human behaviours, perception has taken a backseat in the extant models.As phenomenology makes clear, the emphasis on behaviour over perception leaves out the crucial, indeed foundational mode of intelligence: affective intentionality. Simply put, how we feel about phenomena impacts how we perceive phenomena as significant, inconsequential, interesting, etc. Thus, we should be asking if machines can understand significance? Can they feel any particular way about a game, a move, or the phenomenon of play? This becomes important when mental representation provides the mode of symbolic manipulation and vice versa. 
This occurs in and through a given game or gameified world’s ability to instill, simulate, or otherwise produce affective intentionality. We would argue that herein lies the crux of the mixed reactions to Red Dead Redemption 2. Similarly, the example from the virtual museum highlights the ongoing omission. Human-level AI should not just reproduce a human’s response to inputs, but should produce responses that a human would perceive. In short human-level AI needs to perceive perception itself. Indeed, this is the primary cognitive and affective response. Phenomenology does not tell us that; first principles semiotics tells us that. However, phenomenology gives us the means and methods to understand the response to affective intentionality and, more importantly, to develop the contingent hermeneutic (Merleau-Ponty, 2013). Moreover, semiotics will never encompass the materiality required of such a system, let alone the simulated materiality that exists through the interaction with the AI device and its interface, a device that bears the mark of the maker, just as surely as a bespoke shirt does. Thus, our paper will consider the production of affective intentionality and the ways VR games and gameified systems, like the virtual museum and Red Dead Redemption 2, facilitate, impede, and especially teach the perception of perception. As a corollary, then, our paper necessarily considers meta-cognitive processes—that is, the strategies for learning about learning—that occur in and through interaction with AI in games and devices (cf Hacker, 1998, 2016). Indeed, meta-cognition becomes a contingent component for instilling affective intentionality.","PeriodicalId":247562,"journal":{"name":"Eludamos: Journal for Computer Game Culture","volume":"23 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-04-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Eludamos: Journal for Computer Game Culture","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.7557/23.6169","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

I walk into the room and the smell of burning wood hits me immediately. The warmth from the fireplace grows as I step nearer to it. The fire needs to heat the little cottage through the night, so I add a log to the fire. There are a few sparks and embers. I throw a bigger log onto the fire and it drops with a thud. Again, there are barely any sparks or embers. The heat and the smell stay the same. They do not change, yet I do not become habituated to them. Rather, they are just a steady stream, so I take off my VR headset and give my recommendations to the team programming the gameified world of the virtual museum of the future (one depicting an ancient Turkish settlement, being built now at the institution where one of us works). As much as this technological world seems almost too futuristic, it actually retrieves obsolete items from the past—a heater, a piece of wood, and a spray bottle—in keeping with McLuhan’s (1973) insights regarding media that provide strong participation goals and the rubric for achieving them. Moreover, the VR world extends the progression of game AI that occasioned the love-hate relationship with the “walking sim.” The stronger the AI, the more clearly defined the rubric for participation. In the VR interactive museum the designers want people to be able to “play” with haptic devices—like the smell, smoke, and heat generators—in order to heighten not only the immersion but also the perception of being there, or what Bolter and Grusin (1999) call “immediacy.” Indeed, Bolter and Grusin argue that the need for immediacy overwhelmingly takes over, regardless of the media’s intrusion. However, in the example above, the system fell short because the designers had not accounted for someone laying the “log” down on the virtual fire and having it send out a representative—that is, perceptually convincing, grounded in experience, intuition, etc.—amount of sparks and heat. Someone else could throw the log as hard as they want: the machine only senses whether the log is in or out. This corresponds precisely with how we feel about phenomena, for machines and AI are based upon a model of intelligence which prioritises mental representation and symbolic manipulation. For Laird and van Lent (2001), in their field-defining presentation, the “killer app” of human-level AI was going to be computer games. Writing a decade later in the same conference proceedings, Weber, Mateas, and Jhala (2011) are still responding to this original position by way of AI in strategy games. Writing for this year’s IEEE meeting, Petrović (2018) also makes the case for human-level AI in games. What becomes clear, then, is that as much as we have wanted games to offer human behaviours, perception has taken a backseat in the extant models. As phenomenology makes clear, the emphasis on behaviour over perception leaves out the crucial, indeed foundational, mode of intelligence: affective intentionality. Simply put, how we feel about phenomena impacts how we perceive phenomena as significant, inconsequential, interesting, and so on. Thus, we should be asking whether machines can understand significance. Can they feel any particular way about a game, a move, or the phenomenon of play? This becomes important when mental representation provides the mode of symbolic manipulation and vice versa. This occurs in and through a given game or gameified world’s ability to instill, simulate, or otherwise produce affective intentionality. We would argue that herein lies the crux of the mixed reactions to Red Dead Redemption 2.
Similarly, the example from the virtual museum highlights the ongoing omission. Human-level AI should not just reproduce a human’s response to inputs, but should produce responses that a human would perceive. In short, human-level AI needs to perceive perception itself. Indeed, this is the primary cognitive and affective response. Phenomenology does not tell us that; first-principles semiotics tells us that. However, phenomenology gives us the means and methods to understand the response to affective intentionality and, more importantly, to develop the contingent hermeneutic (Merleau-Ponty, 2013). Moreover, semiotics will never encompass the materiality required of such a system, let alone the simulated materiality that exists through the interaction with the AI device and its interface, a device that bears the mark of its maker just as surely as a bespoke shirt does. Thus, our paper will consider the production of affective intentionality and the ways VR games and gameified systems, like the virtual museum and Red Dead Redemption 2, facilitate, impede, and especially teach the perception of perception. As a corollary, then, our paper necessarily considers meta-cognitive processes—that is, the strategies for learning about learning—that occur in and through interaction with AI in games and devices (cf. Hacker, 1998, 2016). Indeed, meta-cognition becomes a contingent component for instilling affective intentionality.
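To make the gap described in the fireplace example concrete, consider the difference between a system that only registers whether a log is in the fire and one that scales its response to the force of the throw. The following is a minimal, hypothetical sketch in Python; the function names, thresholds, and scaling factors are illustrative assumptions and are not drawn from the virtual museum's actual implementation or any engine described in the paper.

```python
# Hypothetical sketch contrasting the two sensing models discussed above.
# All numbers and names are illustrative assumptions, not the museum's code.

from dataclasses import dataclass
import random


@dataclass
class FireResponse:
    sparks: int        # number of spark/ember particles to emit
    heat_delta: float  # change in radiated heat (arbitrary units)


def binary_fire_response(log_present: bool) -> FireResponse:
    """The behaviour the abstract criticises: the system only senses
    whether a log is in the fire or not, so a gentle placement and a
    hard throw produce exactly the same sparks and heat."""
    if log_present:
        return FireResponse(sparks=3, heat_delta=0.1)
    return FireResponse(sparks=0, heat_delta=0.0)


def perceptual_fire_response(impact_speed: float, log_mass: float) -> FireResponse:
    """A response scaled to the simulated force of the throw, so that
    dropping a log and hurling it read differently to the player,
    closer to what a human would perceive as representative."""
    impulse = impact_speed * log_mass
    sparks = int(impulse * random.uniform(8, 12))  # harder throw, more embers
    heat_delta = 0.05 + 0.02 * log_mass            # bigger log, more heat over time
    return FireResponse(sparks=sparks, heat_delta=heat_delta)


if __name__ == "__main__":
    # The binary model cannot distinguish a gentle placement from a hard throw;
    # the perceptual model produces visibly different responses for each.
    print(binary_fire_response(log_present=True))
    print(perceptual_fire_response(impact_speed=0.3, log_mass=1.5))
    print(perceptual_fire_response(impact_speed=4.0, log_mass=1.5))
```

The point is not the particular numbers but that the second function registers the kind of continuous, bodily variation (how hard the log was thrown, how heavy it is) that players perceive and expect the world to acknowledge, which is what the abstract means by producing responses a human would perceive.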