{"title":"基于食材识别的食谱推荐和烹饪指导","authors":"Saki Asahina, Nobuyuki Umezu","doi":"10.1117/12.2589081","DOIUrl":null,"url":null,"abstract":"As devices such as depth cameras and projectors become cheaper, various methods have been proposed to make people's work environment smarter with AR technologies. In a kitchen, there would be a wide variety of advantages in an intelligent work environment, because people have to manage so many things such as various ingredients, typology of procedures with many recipes, and complicated work environment by mixing materials and tools. In this research, we aim to support cooking by recommending recipes with an installed projector and camera in the kitchen and projecting an operation interface. Our system recognizes ingredients with a DNN-based method called Mask R-CNN. It also has gesture controls that offer users contactless operations based on a Kinect depth sensor. We present four popular recipes obtained from a famous site named the Rakuten recipe. Displayed information includes dish names, ingredients, cooking procedures, process photographs, and time to cook. A series of user experiments with 10 participants was conducted to evaluate the usability of gesture operation with Kinect of the proposed system. The distance between Kinect and the hands of the participants is 0.8 (m). Each participant is given one trial and uses gestures to select one of four recipes displayed on a iMac screen. We received high evaluations (average 4.2 to 4.5 on a 5-point scale) in the results of the experiment questionnaire. 
Future work includes integrating more functions into our system, such as estimating ingredient amount based on the areas of recognized materials with Mask R-CNN, and cooking process recognition.","PeriodicalId":295011,"journal":{"name":"International Conference on Quality Control by Artificial Vision","volume":"62 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Recipe recommendation and cooking instruction based on food material recognition\",\"authors\":\"Saki Asahina, Nobuyuki Umezu\",\"doi\":\"10.1117/12.2589081\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"As devices such as depth cameras and projectors become cheaper, various methods have been proposed to make people's work environment smarter with AR technologies. In a kitchen, there would be a wide variety of advantages in an intelligent work environment, because people have to manage so many things such as various ingredients, typology of procedures with many recipes, and complicated work environment by mixing materials and tools. In this research, we aim to support cooking by recommending recipes with an installed projector and camera in the kitchen and projecting an operation interface. Our system recognizes ingredients with a DNN-based method called Mask R-CNN. It also has gesture controls that offer users contactless operations based on a Kinect depth sensor. We present four popular recipes obtained from a famous site named the Rakuten recipe. Displayed information includes dish names, ingredients, cooking procedures, process photographs, and time to cook. A series of user experiments with 10 participants was conducted to evaluate the usability of gesture operation with Kinect of the proposed system. The distance between Kinect and the hands of the participants is 0.8 (m). 
Each participant is given one trial and uses gestures to select one of four recipes displayed on a iMac screen. We received high evaluations (average 4.2 to 4.5 on a 5-point scale) in the results of the experiment questionnaire. Future work includes integrating more functions into our system, such as estimating ingredient amount based on the areas of recognized materials with Mask R-CNN, and cooking process recognition.\",\"PeriodicalId\":295011,\"journal\":{\"name\":\"International Conference on Quality Control by Artificial Vision\",\"volume\":\"62 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-07-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Conference on Quality Control by Artificial Vision\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1117/12.2589081\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Conference on Quality Control by Artificial Vision","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1117/12.2589081","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Recipe recommendation and cooking instruction based on food material recognition
As devices such as depth cameras and projectors become cheaper, various methods have been proposed to make people's work environments smarter with AR technologies. A kitchen would benefit greatly from an intelligent work environment, because cooks must manage many things at once: diverse ingredients, procedures that vary across recipes, and a workspace made complicated by the mix of materials and tools. In this research, we aim to support cooking by installing a projector and camera in the kitchen, recommending recipes, and projecting an operation interface. Our system recognizes ingredients with Mask R-CNN, a DNN-based instance segmentation method. It also offers contactless gesture controls based on a Kinect depth sensor. We present four popular recipes obtained from the well-known Rakuten Recipe site. The displayed information includes dish names, ingredients, cooking procedures, process photographs, and cooking times. A series of user experiments with 10 participants was conducted to evaluate the usability of the proposed system's Kinect-based gesture operation. The distance between the Kinect and the participants' hands was 0.8 m. Each participant was given one trial and used gestures to select one of four recipes displayed on an iMac screen. The experiment questionnaire yielded high evaluations (averages of 4.2 to 4.5 on a 5-point scale). Future work includes integrating more functions into our system, such as estimating ingredient amounts from the areas of regions recognized by Mask R-CNN, and recognizing cooking processes.
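The future-work idea of estimating ingredient amounts from the areas of recognized regions could be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names and the pixel-to-centimeter calibration factor are hypothetical, and in practice the boolean masks would come from Mask R-CNN's per-instance output rather than the toy arrays used here.

```python
import numpy as np

def mask_area_cm2(mask: np.ndarray, cm_per_pixel: float) -> float:
    """Approximate real-world area of one recognized ingredient region.

    mask: boolean (H, W) instance mask, e.g. one Mask R-CNN output mask.
    cm_per_pixel: camera calibration factor (hypothetical; would have to be
    measured for the actual projector/camera setup).
    """
    # Pixel count times the area of a single pixel in cm^2.
    return float(mask.sum()) * cm_per_pixel ** 2

def estimate_amounts(masks, labels, cm_per_pixel=0.05):
    """Sum per-ingredient areas over all instance masks with the same label."""
    totals = {}
    for mask, label in zip(masks, labels):
        totals[label] = totals.get(label, 0.0) + mask_area_cm2(mask, cm_per_pixel)
    return totals

# Toy example: two square masks standing in for Mask R-CNN detections.
m1 = np.zeros((100, 100), dtype=bool)
m1[10:30, 10:30] = True   # 400 pixels
m2 = np.zeros((100, 100), dtype=bool)
m2[50:60, 50:60] = True   # 100 pixels
print(estimate_amounts([m1, m2], ["carrot", "carrot"]))  # {'carrot': 1.25}
```

Area alone is only a proxy for amount; a depth sensor such as the Kinect already present in the setup could, in principle, refine this into a volume estimate.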