Deep signatures for indexing and retrieval in large motion databases

Yingying Wang, Michael Neff
DOI: 10.1145/2822013.2822024
Published in: Proceedings of the 8th ACM SIGGRAPH Conference on Motion in Games
Publication date: 2015-11-16
Citations: 31

Abstract

Data-driven motion research requires effective tools to compress, index, retrieve, and reconstruct captured motion data. In this paper, we present a novel method that performs these tasks using a deep learning architecture. Our deep autoencoder, a form of artificial neural network, encodes motion segments into "deep signatures". Each signature is formed by concatenating signatures for functionally distinct parts of the body. The deep signature is a highly condensed representation of a motion segment, requiring only 20 bytes, yet it still encodes high-level motion features. It can be used to produce a very compact representation of a motion database that is effective for motion indexing and retrieval with a very small memory footprint. Database searches reduce to low-cost binary comparisons of signatures. Motion reconstruction is achieved by completing a "deep signature" whose missing section is filled in using Gibbs sampling. We tested both manually and automatically segmented motion databases, and our experiments show that extracting the deep signature is fast and scales well to large databases. Given a query motion, similar motion segments can be retrieved at interactive speed with excellent match quality.
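The abstract notes that database searches reduce to low-cost binary comparisons of 20-byte signatures. A minimal sketch of what such retrieval could look like, assuming the signatures are bit vectors compared by Hamming distance (the paper's exact signature encoding and comparison are not specified here; all names and sizes below are illustrative):

```python
import numpy as np

SIG_BYTES = 20  # a 20-byte (160-bit) signature per motion segment, as in the abstract

def hamming_distance(a: bytes, b: bytes) -> int:
    """Number of differing bits between two binary signatures."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

def retrieve(query: bytes, database: list[bytes], k: int = 5) -> list[int]:
    """Return indices of the k database signatures closest to the query."""
    dists = [hamming_distance(query, sig) for sig in database]
    return sorted(range(len(database)), key=dists.__getitem__)[:k]

# Illustrative usage with random stand-in signatures (not real motion data).
rng = np.random.default_rng(0)
db = [rng.bytes(SIG_BYTES) for _ in range(1000)]
query = db[42]  # querying with a stored signature should rank itself first
print(retrieve(query, db)[0])  # → 42
```

Because each comparison touches only 20 bytes, a linear scan over even a large database stays cheap, which is consistent with the abstract's claim of interactive-speed retrieval.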