Realization of multimodal geo-tag using ARM-A53 with python

S. A. Abbas, S. K. Ajin, P. Sundar
{"title":"Realization of multimodal geo-tag using ARM-A53 with python","authors":"S. A. Abbas, S. K. Ajin, P. Sundar","doi":"10.1109/I2C2.2017.8321905","DOIUrl":null,"url":null,"abstract":"The development of Web and GPS devices is becoming popular in our daily life. Location-based services are rapidly increasing in the online world. The main principle behind these services is the development of a very personalized experience. Social-media websites allow queries for results originating at a certain location. Enabling new technologies with location information will be attractive to many businesses. The task of estimating the geo-coordinates of a media-recording is known as geo-tagging. Geo-tag is most commonly used for photographs and can help people get a more information about where the pictures was taken or the exact location of a friend who logged on to a service this information helps to develop a better semantic understanding of multimedia content. Therefore, for reliable geotags, this project suggests adopting a multimodal geotagging. This develop tourist destinations and newer applications Thus the multimodal information such as voice, latitude, longitude, temperature, humidity of different location is measured and stored in the database. 
This helps the user to have a better understanding of the location.","PeriodicalId":288351,"journal":{"name":"2017 International Conference on Intelligent Computing and Control (I2C2)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 International Conference on Intelligent Computing and Control (I2C2)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/I2C2.2017.8321905","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

The Web and GPS devices have become part of our daily lives, and location-based services are growing rapidly in the online world. The main principle behind these services is delivering a highly personalized experience. Social-media websites allow queries for results originating at a certain location, and enabling new technologies with location information is attractive to many businesses. The task of estimating the geo-coordinates of a media recording is known as geo-tagging. Geo-tags are most commonly applied to photographs: they help people learn more about where a picture was taken, or the exact location of a friend who logged on to a service, and this information supports a better semantic understanding of multimedia content. For reliable geo-tags, this project proposes adopting multimodal geo-tagging, which can benefit tourist destinations and enable newer applications. Multimodal information, such as voice, latitude, longitude, temperature, and humidity, is measured at different locations and stored in a database. This helps the user gain a better understanding of each location.
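The abstract does not include implementation details, but the pipeline it describes (multimodal readings captured on an ARM Cortex-A53 board in Python and stored in a database) can be sketched minimally as follows. This is an assumption-laden illustration, not the paper's actual code: the sensor values in `read_sensors` are hard-coded placeholders, and on real hardware they would come from GPS and temperature/humidity drivers; SQLite is used only as a convenient stand-in for whatever database the authors chose.

```python
import sqlite3
from datetime import datetime, timezone


def store_geotag(conn, record):
    """Insert one multimodal geo-tag record into the database."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS geotags ("
        " timestamp TEXT, latitude REAL, longitude REAL,"
        " temperature REAL, humidity REAL, voice_note TEXT)"
    )
    conn.execute(
        "INSERT INTO geotags VALUES (?, ?, ?, ?, ?, ?)",
        (record["timestamp"], record["latitude"], record["longitude"],
         record["temperature"], record["humidity"], record["voice_note"]),
    )
    conn.commit()


def read_sensors():
    """Placeholder for real GPS / temperature-humidity sensor reads.

    The fixed values below are illustrative only; on the board they
    would be replaced by calls into the actual sensor drivers.
    """
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "latitude": 13.0827,      # example coordinates, not from the paper
        "longitude": 80.2707,
        "temperature": 31.5,      # degrees Celsius
        "humidity": 68.0,         # percent relative humidity
        "voice_note": "voice_0001.wav",  # path to a recorded voice clip
    }


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")   # in-memory DB for demonstration
    store_geotag(conn, read_sensors())
    print(conn.execute("SELECT latitude, longitude FROM geotags").fetchone())
```

Parameterized queries (`?` placeholders) keep the insert safe and simple; a real deployment would loop over periodic sensor reads rather than storing a single record.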