{"title":"Turning away from talking heads: the use of video-as-data in neurosurgery","authors":"B. Nardi, Heinrich Schwarz, A. Kuchinsky, R. Leichner, S. Whittaker, R. Sclabassi","doi":"10.1145/169059.169261","DOIUrl":"https://doi.org/10.1145/169059.169261","url":null,"abstract":"Studies of video as a support for collaborative work have provided little hard evidence of its utility for either task performance or fostering telepresence, i.e. the conveyance of a face-to-face like social presence for remotely located participants. To date, most research on the value of video has concentrated on “talking heads” video in which the video images are of remote participants conferring or performing some task together. In contrast to talking heads video, we studied video-as-data in which video images of the workspace and work objects are the focus of interest, and convey critical information about the work. The use of video-as-data is intended to enhance task performance, rather than to provide telepresence. We studied the use of video during neurosurgery within the operating room and at remote locations away from the operating room. The workspace shown in the video is the surgical field (brain or spine) that the surgeon is operating on. We discuss our findings on the use of live and recorded video, and suggest extensions to video-as-data including its integration with computerized time-based information sources to educate and co-ordinate complex actions among distributed workgroups.","PeriodicalId":407219,"journal":{"name":"Proceedings of the INTERACT '93 and CHI '93 Conference on Human Factors in Computing Systems","volume":"8 6","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1993-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114039616","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"MicroCentre, Dundee: ordinary and extra-ordinary HCI research","authors":"A. Newell","doi":"10.1145/169059.169191","DOIUrl":"https://doi.org/10.1145/169059.169191","url":null,"abstract":"The unit has an annual research income of approximately £350,000, and a staff of over twenty researchers led by computer and human factors engineers. The group contains a unique and stimulating blend of disciplines including computer scientists and engineers as well as psychologists, therapists, school teachers, a linguist, a philosopher and researchers with social work training. It collaborates closely with the University Medical School and Social Work Department as well as the local Education and Therapy Services. User evaluation and trials are conducted in the laboratory, and in local schools, clinics, and residential and private homes.","PeriodicalId":407219,"journal":{"name":"Proceedings of the INTERACT '93 and CHI '93 Conference on Human Factors in Computing Systems","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1993-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128792592","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Software for the usability lab: a sampling of current tools","authors":"P. Weiler","doi":"10.1145/169059.169076","DOIUrl":"https://doi.org/10.1145/169059.169076","url":null,"abstract":"This panel brings together usability professionals throughout the computer industry to demonstrate and discuss their usability lab software tools. These tools are specifically designed to improve the data collection and analysis process for usability labs. Their capabilities range from simple to complex and the panel will not only discuss the benefits of using the tools but also share the lessons learned during the design and development process.","PeriodicalId":407219,"journal":{"name":"Proceedings of the INTERACT '93 and CHI '93 Conference on Human Factors in Computing Systems","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1993-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114305220","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Common elements in today's graphical user interfaces: the good, the bad, and the ugly","authors":"A. Farrand","doi":"10.1145/169059.169407","DOIUrl":"https://doi.org/10.1145/169059.169407","url":null,"abstract":"This panel will identify some of the similarities amongst the different familiar graphical user interfaces that make them seem so indistinguishable. This panel will then identify some of the similarities that don't belong in any modern user interface.","PeriodicalId":407219,"journal":{"name":"Proceedings of the INTERACT '93 and CHI '93 Conference on Human Factors in Computing Systems","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1993-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121436013","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A space based model for user interaction in shared synthetic environments","authors":"L. Fahlén, C. G. Brown, Olov Ståhl, C. Carlsson","doi":"10.1145/169059.169068","DOIUrl":"https://doi.org/10.1145/169059.169068","url":null,"abstract":"In a distributed shared synthetic environment with provisions for high quality 3D visualization and interaction, it is possible to implement a powerful variant of a rooms/space metaphor based on the concept of presence or proximity between participants in 3D space. This kind of model can be used as an interface between the user and the computer, for overview and control of applications, file systems, networks and other computer resources, as well as for communication and collaboration with other users in the networked environment. We model proximity with a geometric volume of the immediate surroundings, the aura, of the participant's representation in the synthetic environment. This proximity, or aura, is used to establish presence at meetings, to establish communication channels and to provide interaction.","PeriodicalId":407219,"journal":{"name":"Proceedings of the INTERACT '93 and CHI '93 Conference on Human Factors in Computing Systems","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1993-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124446665","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Beyond interface builders: model-based interface tools","authors":"Pedro A. Szekely, Ping Luo, R. Neches","doi":"10.1145/169059.169305","DOIUrl":"https://doi.org/10.1145/169059.169305","url":null,"abstract":"Interface builders only support the construction of the menus and dialogue boxes of an application. They do not support the construction of interfaces of many application classes (visualization, simulation, command and control, domain-specific editors) because of the dynamic and complex information that these applications process. HUMANOID is a model-based interface design and construction tool where interfaces are specified by building a declarative description (model) of their presentation and behavior. HUMANOID's modeling language provides simple abstraction, iteration and conditional constructs to model the interface features of these application classes. HUMANOID provides an easy-to-use designer's interface that lets designers build complex interfaces without programming.","PeriodicalId":407219,"journal":{"name":"Proceedings of the INTERACT '93 and CHI '93 Conference on Human Factors in Computing Systems","volume":"4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1993-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131351863","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Queries-R-Links: graphical markup for text navigation","authors":"G. Golovchinsky, M. Chignell","doi":"10.1145/169059.169372","DOIUrl":"https://doi.org/10.1145/169059.169372","url":null,"abstract":"In this paper we introduce a style of interaction (interactive querying) that combines features of hypertext with Boolean querying, using direct markup of text to launch queries. We describe two experiments that compare the relative ease of expressing Boolean queries as text versus a graphical equivalent. The results of these experiments show that the expression of queries in the graphical format is no more difficult than the textual equivalent. We then describe the Queries-R-Links system that we have developed at the University of Toronto. Queries-R-Links uses the graphical markup method to launch Boolean queries interactively using direct markup of text. This work represents significant progress towards information exploration systems that combine the useful features of information retrieval querying and hypertext browsing.","PeriodicalId":407219,"journal":{"name":"Proceedings of the INTERACT '93 and CHI '93 Conference on Human Factors in Computing Systems","volume":"77 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1993-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115741608","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Computer aided conversation for severely physically impaired non-speaking people","authors":"N. Alm, J. Todman, Leona Elder, A. Newell","doi":"10.1145/169059.169187","DOIUrl":"https://doi.org/10.1145/169059.169187","url":null,"abstract":"This paper reports the development of a computer-aided conversation prosthesis which is designed for severely physically impaired non-speaking people. The research methodology was to model aspects of conversational structure derived from the field of conversation analysis within a prototype conversational prosthesis. The prototype was evaluated in empirical investigations which also suggested successful strategies for carrying out satisfying conversation using such a system. Two versions have been built and tested, one using an able-bodied operator to test the feasibility of creating conversation from prestored material, the second being used by a physically impaired non-speaking operator. The prototype demonstrated the advantages of this interface design in helping the user to carry out natural sounding and satisfying conversations.","PeriodicalId":407219,"journal":{"name":"Proceedings of the INTERACT '93 and CHI '93 Conference on Human Factors in Computing Systems","volume":"52 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1993-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114595470","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Human performance using computer input devices in the preferred and non-preferred hands","authors":"Paul Kabbash, I. MacKenzie, W. Buxton","doi":"10.1145/169059.169414","DOIUrl":"https://doi.org/10.1145/169059.169414","url":null,"abstract":"Subjects' performance was compared in pointing and dragging tasks using the preferred and non-preferred hands. Tasks were tested using three different input devices: a mouse, a trackball, and a tablet-with-stylus. The trackball had the least degradation across hands in performing the tasks; however, it remained inferior to both the mouse and stylus. For small distances and small targets, the preferred hand was superior. However, for larger targets and larger distances, both hands performed about the same. The experiment shows that the non-preferred hand is more than a poor approximation of the preferred hand. The hands are complementary, each having its own strength and weakness. One design implication is that the non-preferred hand is well suited for tasks that do not require precise action, such as scrolling.","PeriodicalId":407219,"journal":{"name":"Proceedings of the INTERACT '93 and CHI '93 Conference on Human Factors in Computing Systems","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1993-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128198952","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Touch-typing with a stylus (abstract)","authors":"David Goldberg, Cate Richardson","doi":"10.1145/169059.169500","DOIUrl":"https://doi.org/10.1145/169059.169500","url":null,"abstract":"Keyboards are a vital part of today's computers. Although keyboards are somewhat bulky, they are well suited to PCs (even portable laptops) and workstations. In the future of Ubiquitous Computing [3], pocket-sized and wall-sized computers will be common. A keyboard is not very suitable for these sizes of computers. Thus many manufacturers are providing electronic pens or styli (we use the two terms interchangeably) as the primary input device for computers. A stylus is attractive because it works very well over the entire range of sizes. However, it is not very convenient for text entry. The state of the art is to print characters, with boxed entry recommended to improve accuracy [1]. This is slow and error prone [2]. This suggests that a major impediment to the widespread use of styli is the problem of finding a convenient way to enter text. There is an analogy between keyboards and styli. Keyboards can be used with no training: the letters can be tapped out one-by-one using hunt-and-peck. This is similar to what is currently done with styli. No new training is required, and letters are printed one-by-one. However, unlike styli, keyboards have a “growth path.” With practice, hunt-and-peck with two fingers can become faster than handwriting. If even higher speeds are desired, then keyboard users can learn touch-typing. Touch-typing not only achieves high speeds, it also enables “eyes-free” operation, that is, the ability to type without having to look at your hands. This suggests that the solution to the problem of stylus text entry requires developing an analogue of touch-typing. Our approach to developing touch-typing for a stylus is based on introducing a special alphabet of unistrokes. Like touch-typing for keyboards, unistrokes have to be learned. Unistrokes have the following advantages over ordinary printing: ● They are designed somewhat like error-correcting codes. When written sloppily, they can still be distinguished from one another. ● Each unistroke is a single pen-down/pen-up motion, hence the name unistroke. Not only does this mean that recognition cannot have segmentation errors (that is, errors in determining which sets of strokes belong to a single multi-stroke letter), but it means that letters can unambiguously be written one on top of another. Thus unistrokes can be entered in a small box just big enough to hold one letter. ● The unistrokes associated with the most common letters ('e', 'a', 't', 'i', 'r') are all straight lines, and hence are fast to write. The unistroke design is bei","PeriodicalId":407219,"journal":{"name":"Proceedings of the INTERACT '93 and CHI '93 Conference on Human Factors in Computing Systems","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1993-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130147660","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}