Recipe recommendation and cooking instruction based on food material recognition

Saki Asahina, Nobuyuki Umezu
{"title":"Recipe recommendation and cooking instruction based on food material recognition","authors":"Saki Asahina, Nobuyuki Umezu","doi":"10.1117/12.2589081","DOIUrl":null,"url":null,"abstract":"As devices such as depth cameras and projectors become cheaper, various methods have been proposed to make people's work environment smarter with AR technologies. In a kitchen, there would be a wide variety of advantages in an intelligent work environment, because people have to manage so many things such as various ingredients, typology of procedures with many recipes, and complicated work environment by mixing materials and tools. In this research, we aim to support cooking by recommending recipes with an installed projector and camera in the kitchen and projecting an operation interface. Our system recognizes ingredients with a DNN-based method called Mask R-CNN. It also has gesture controls that offer users contactless operations based on a Kinect depth sensor. We present four popular recipes obtained from a famous site named the Rakuten recipe. Displayed information includes dish names, ingredients, cooking procedures, process photographs, and time to cook. A series of user experiments with 10 participants was conducted to evaluate the usability of gesture operation with Kinect of the proposed system. The distance between Kinect and the hands of the participants is 0.8 (m). Each participant is given one trial and uses gestures to select one of four recipes displayed on a iMac screen. We received high evaluations (average 4.2 to 4.5 on a 5-point scale) in the results of the experiment questionnaire. Future work includes integrating more functions into our system, such as estimating ingredient amount based on the areas of recognized materials with Mask R-CNN, and cooking process recognition.","PeriodicalId":295011,"journal":{"name":"International Conference on Quality Control by Artificial Vision","volume":"62 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Conference on Quality Control by Artificial Vision","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1117/12.2589081","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

As devices such as depth cameras and projectors become cheaper, various methods have been proposed to make people's work environments smarter with AR technologies. A kitchen would benefit greatly from an intelligent work environment, because people have to manage many things at once: various ingredients, different procedures across many recipes, and a workspace complicated by the mixing of materials and tools. In this research, we aim to support cooking by recommending recipes with a projector and camera installed in the kitchen and by projecting an operation interface. Our system recognizes ingredients with a DNN-based method, Mask R-CNN. It also provides gesture controls based on a Kinect depth sensor, offering users contactless operation. The system presents four popular recipes obtained from Rakuten Recipe, a well-known recipe site. The displayed information includes dish names, ingredients, cooking procedures, process photographs, and cooking time. A series of user experiments with 10 participants was conducted to evaluate the usability of the proposed system's Kinect-based gesture operation. The distance between the Kinect and the participants' hands was 0.8 m. Each participant was given one trial and used gestures to select one of four recipes displayed on an iMac screen. The experiment questionnaire yielded high ratings (averages of 4.2 to 4.5 on a 5-point scale). Future work includes integrating more functions into our system, such as estimating ingredient amounts from the areas of regions recognized by Mask R-CNN, and recognizing the cooking process.
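The paper does not include code, but a minimal sketch of the ingredient-recognition step it describes could look like the following, assuming an off-the-shelf torchvision Mask R-CNN that has been fine-tuned on ingredient classes. The class names, score threshold, and function name here are illustrative assumptions, not details from the paper.

```python
# Sketch: Mask R-CNN inference on a kitchen-camera frame to list detected ingredients.
# Assumes torchvision >= 0.13; the ingredient label map is hypothetical.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# Hypothetical ingredient classes; the paper does not list the classes it uses.
INGREDIENT_CLASSES = {1: "onion", 2: "carrot", 3: "potato", 4: "chicken"}

def detect_ingredients(image_path: str, score_threshold: float = 0.7):
    """Run Mask R-CNN on one frame and return ingredients above the score threshold."""
    model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
    model.eval()

    image = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        output = model([image])[0]  # dict with "boxes", "labels", "scores", "masks"

    detections = []
    for label, score, mask in zip(output["labels"], output["scores"], output["masks"]):
        if score < score_threshold:
            continue
        name = INGREDIENT_CLASSES.get(int(label), f"class_{int(label)}")
        # The mask area (in pixels) is the kind of quantity the authors plan to use
        # for their future ingredient-amount estimation.
        area = int((mask[0] > 0.5).sum())
        detections.append({"ingredient": name, "score": float(score), "mask_area": area})
    return detections

if __name__ == "__main__":
    for d in detect_ingredients("kitchen_frame.jpg"):
        print(d)
```

In practice the COCO-pretrained weights would have to be fine-tuned on ingredient images before the label map above is meaningful; the sketch only illustrates the inference flow described in the abstract.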