{"title":"Time-ART:在探索性分析的早期阶段对多媒体数据进行分割和注释的工具","authors":"Yasuhiro Yamamoto, Atsushi Aoki, K. Nakakoji","doi":"10.1145/634067.634136","DOIUrl":null,"url":null,"abstract":"Time-ART is a tool that helps a user in conducting empirical multimedia(video/sound) data analysis as an exploratory iterative process. Time-ART helps a user in (1) identifying seemingly interesting parts, (2) annotating them both textually and visually by positioning them in a 2D space, and (3) producing a summary report. The system consists of Movie/SoundEditor to segment a part of a movie/sound, ElementSpace, which is a free 2D space where a user can position segmented parts as objects, a TrackListController that synchronously plays multiple sound/video data, AnnotationEditor with which a user can textually annotate each positioned object, DocumentViewer that automatically compiles positioned parts and their annotations in the space, ViewFinder that provides a 3D view of ElementSpace allowing a user to use different \"depth\" as layers to classify positioned objects, and TimeChart that is another 3D view of ElementSpace helping a user understand the location of each segmented part in terms of the original movie/sound.","PeriodicalId":351792,"journal":{"name":"CHI '01 Extended Abstracts on Human Factors in Computing Systems","volume":"27 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2001-03-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"12","resultStr":"{\"title\":\"Time-ART: a tool for segmenting and annotating multimedia data in early stages of exploratory analysis\",\"authors\":\"Yasuhiro Yamamoto, Atsushi Aoki, K. Nakakoji\",\"doi\":\"10.1145/634067.634136\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Time-ART is a tool that helps a user in conducting empirical multimedia(video/sound) data analysis as an exploratory iterative process. Time-ART helps a user in (1) identifying seemingly interesting parts, (2) annotating them both textually and visually by positioning them in a 2D space, and (3) producing a summary report. 
The system consists of Movie/SoundEditor to segment a part of a movie/sound, ElementSpace, which is a free 2D space where a user can position segmented parts as objects, a TrackListController that synchronously plays multiple sound/video data, AnnotationEditor with which a user can textually annotate each positioned object, DocumentViewer that automatically compiles positioned parts and their annotations in the space, ViewFinder that provides a 3D view of ElementSpace allowing a user to use different \\\"depth\\\" as layers to classify positioned objects, and TimeChart that is another 3D view of ElementSpace helping a user understand the location of each segmented part in terms of the original movie/sound.\",\"PeriodicalId\":351792,\"journal\":{\"name\":\"CHI '01 Extended Abstracts on Human Factors in Computing Systems\",\"volume\":\"27 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2001-03-31\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"12\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"CHI '01 Extended Abstracts on Human Factors in Computing Systems\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/634067.634136\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"CHI '01 Extended Abstracts on Human Factors in Computing Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/634067.634136","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Time-ART: a tool for segmenting and annotating multimedia data in early stages of exploratory analysis
Time-ART is a tool that helps a user conduct empirical multimedia (video/sound) data analysis as an exploratory, iterative process. Time-ART supports the user in (1) identifying seemingly interesting parts, (2) annotating them both textually and visually by positioning them in a 2D space, and (3) producing a summary report. The system consists of the Movie/SoundEditor, which segments parts of a movie or sound; ElementSpace, a free 2D space where a user can position segmented parts as objects; the TrackListController, which synchronously plays multiple sound and video data; the AnnotationEditor, with which a user can textually annotate each positioned object; the DocumentViewer, which automatically compiles the positioned parts and their annotations in the space; the ViewFinder, which provides a 3D view of ElementSpace and lets a user treat different "depths" as layers for classifying positioned objects; and the TimeChart, another 3D view of ElementSpace that helps a user understand where each segmented part is located within the original movie or sound.
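The abstract names the system's components but not their data model. The following is a minimal Python sketch of how a segmented part, its placement in ElementSpace (including the depth layer used by the ViewFinder), and its textual annotation might be represented, along with a rough analogue of the DocumentViewer's report compilation. All class and field names here are assumptions for illustration, not the authors' implementation.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Segment:
    """A part of a movie/sound cut out with the Movie/SoundEditor (hypothetical model)."""
    source: str        # path or identifier of the original movie/sound
    start_sec: float   # segment start within the source
    end_sec: float     # segment end within the source

@dataclass
class PositionedObject:
    """A segment placed in ElementSpace with a textual annotation (hypothetical model)."""
    segment: Segment
    x: float              # 2D position in ElementSpace
    y: float
    depth: int = 0        # layer used by the ViewFinder to classify objects
    annotation: str = ""  # text attached via the AnnotationEditor

def compile_report(objects: List[PositionedObject]) -> str:
    """Roughly what the DocumentViewer does: gather positioned parts and
    their annotations into a summary, here ordered by vertical position."""
    lines = []
    for obj in sorted(objects, key=lambda o: o.y):
        seg = obj.segment
        lines.append(
            f"{seg.source} [{seg.start_sec:.1f}-{seg.end_sec:.1f}s]: {obj.annotation}"
        )
    return "\n".join(lines)

if __name__ == "__main__":
    clip = Segment("interview.mov", 12.0, 27.5)
    obj = PositionedObject(clip, x=100, y=40, depth=1,
                           annotation="Subject hesitates before answering")
    print(compile_report([obj]))
```

Under this sketch, the TimeChart view would simply read each object's `source`, `start_sec`, and `end_sec` to show where the segment falls in the original recording, while the ViewFinder would group objects by `depth`.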