{"title":"Tangible and Embodied Interaction","authors":"","doi":"10.1145/3544564.3544570","DOIUrl":null,"url":null,"abstract":"typically provides the face and hands of computation into the realm of human experience. In the first decades of digital computation, both the input and output to computational systems were generally physical [Bush 1931, Hartree 1946, Ceruzzi 1981]. For example, with the 1940s ENIAC computer, data input and output were channeled via paper card or tape [Alt 1972], with programs and their parame ters expressed through thousands of cables, sockets, knobs, and switches [Wilkes 1967]. Influenced by pioneering systems such as SAGE [Astrahan et al. 1957] and Sketchpad [Sutherland 1964, Sutherland et al. 1969], the input and especially output of computational systems progressively shifted to virtual, screen-based forms. This transformation has yielded user experiences that can be seen as visually rich but sensorily impoverished, engaging a very limited fragment of our human abilities to sense and engage our world. For example, consider the “presence” of a given computational application. From a few meters’ distance, whether your friend is working on a laptop or tablet, smartphone or HMD—can you tell if she is bal ancing a spreadsheet, playing Solitaire, flying a drone, or hacking a genome? Are you sure? Compare this with many of our most basic human engagements. In cultures where meeting a person is often accompanied by a handshake, the handshake carries many cues. The firmness of grasp, duration, perhaps moisture, and (in some contexts) perhaps a glove, all modulated with complex personal, social, and Tangible and Embodied Interaction","PeriodicalId":378123,"journal":{"name":"Weaving Fire into Form","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Weaving Fire into Form","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3544564.3544570","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
Tangible and embodied interaction typically provides the face and hands of computation into the realm of human experience. In the first decades of digital computation, both the input and output of computational systems were generally physical [Bush 1931, Hartree 1946, Ceruzzi 1981]. For example, with the 1940s ENIAC computer, data input and output were channeled via paper card or tape [Alt 1972], with programs and their parameters expressed through thousands of cables, sockets, knobs, and switches [Wilkes 1967]. Influenced by pioneering systems such as SAGE [Astrahan et al. 1957] and Sketchpad [Sutherland 1964, Sutherland et al. 1969], the input and especially the output of computational systems progressively shifted to virtual, screen-based forms. This transformation has yielded user experiences that can be seen as visually rich but sensorily impoverished, engaging only a limited fragment of our human abilities to sense and engage our world. For example, consider the "presence" of a given computational application. From a few meters' distance, whether your friend is working on a laptop or tablet, smartphone or HMD, can you tell whether she is balancing a spreadsheet, playing Solitaire, flying a drone, or hacking a genome? Are you sure? Compare this with many of our most basic human engagements. In cultures where meeting a person is often accompanied by a handshake, the handshake carries many cues: the firmness of grasp, its duration, perhaps moisture, and (in some contexts) perhaps a glove, all modulated with complex personal, social, and …