Automatic cattle activity recognition on grazing systems

J. F. Ramirez Agudelo, Sebastian Bedoya Mazo, Sandra Lucia Posada Ochoa, Jaime Ricardo Rosero Noguera

Biotecnologia en el Sector Agropecuario y Agroindustrial, 2022-03-07. DOI: 10.18684/rbsaa.v20.n2.2022.1940
Collars, pedometers, and activity tags are expensive tools for recording cattle behavior over short periods (e.g., 24 h). In this context, the development of low-cost, easy-to-use technologies is relevant. Similar to smartphone apps for human activity recognition, which analyze data from embedded triaxial accelerometer sensors, we developed an Android app to record activity in cattle. Four main steps were followed: a) data acquisition for model training, b) model training, c) app deployment, and d) app utilization. For data acquisition, we developed a system with three components: two smartphones and a Google Firebase account for data storage. For model training, the resulting database was used to train a recurrent neural network, whose performance was assessed with a confusion matrix. For all observed activities, the trained model achieved high prediction accuracy (> 96 %). The trained model was then deployed in an Android app using the TensorFlow API. Finally, three cell phones (LG GM730) were used to test the app and record the activity of six Holstein cows (3 lactating and 3 non-lactating). Direct, non-systematic observations of the animals were made to contrast with the activities recorded by the device. Our results show consistency between the direct observations and the activity recorded by our Android app.
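The abstract does not specify the network architecture or training configuration. As a rough illustration only, a minimal TensorFlow/Keras sketch of a recurrent classifier over fixed-length windows of triaxial accelerometer data might look like the following; the window length, activity label set, layer sizes, and dummy data are all assumptions, not the authors' published setup.

```python
# Minimal sketch (assumed configuration, not the paper's exact model):
# classify windows of triaxial accelerometer samples into cattle activities.
import numpy as np
import tensorflow as tf

WINDOW = 128    # accelerometer samples per window (assumed)
CHANNELS = 3    # x, y, z axes of the triaxial sensor
ACTIVITIES = ["grazing", "ruminating", "walking", "resting"]  # hypothetical label set

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, CHANNELS)),
    tf.keras.layers.LSTM(64),                  # recurrent layer over the time axis
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(len(ACTIVITIES), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random stand-in data in place of the Firebase-exported database.
x = np.random.randn(256, WINDOW, CHANNELS).astype("float32")
y = np.random.randint(0, len(ACTIVITIES), size=256)
model.fit(x, y, epochs=3, batch_size=32)

# Confusion matrix, the evaluation the abstract describes.
preds = model.predict(x).argmax(axis=1)
print(tf.math.confusion_matrix(y, preds, num_classes=len(ACTIVITIES)))
```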
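For the deployment step, a trained Keras model is commonly converted to TensorFlow Lite before being bundled into an Android app. The abstract says only that the app was deployed "using the TensorFlow API", so whether the authors followed this exact path is an assumption; the snippet below continues the sketch above, and the output file name is illustrative.

```python
# Continuation of the sketch above: export `model` for on-device inference.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Hypothetical file name; the .tflite file would be shipped as an app asset.
with open("cattle_activity.tflite", "wb") as f:
    f.write(tflite_model)
```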