{"title":"面向全功能神经图的神经决策树","authors":"Han Xiao, Ge Xu","doi":"10.1142/s2301385020500132","DOIUrl":null,"url":null,"abstract":"Though the traditional algorithms could be embedded into neural architectures with the proposed principle of [H. Xiao, Hungarian layer: Logics empowered neural architecture, arXiv: 1712.02555], the variables that only occur in the condition of branch could not be updated as a special case. To tackle this issue, we multiply the conditioned branches with Dirac symbol (i.e., [Formula: see text]), then approximate Dirac symbol with the continuous functions (e.g., [Formula: see text]). In this way, the gradients of condition-specific variables could be worked out in the back-propagation process, approximately, making a fully functional neural graph. Within our novel principle, we propose the neural decision tree (NDT), which takes simplified neural networks as decision function in each branch and employs complex neural networks to generate the output in each leaf. Extensive experiments verify our theoretical analysis and demonstrate the effectiveness of our model.","PeriodicalId":164619,"journal":{"name":"Unmanned Syst.","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2020-02-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Neural Decision Tree Towards Fully Functional Neural Graph\",\"authors\":\"Han Xiao, Ge Xu\",\"doi\":\"10.1142/s2301385020500132\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Though the traditional algorithms could be embedded into neural architectures with the proposed principle of [H. Xiao, Hungarian layer: Logics empowered neural architecture, arXiv: 1712.02555], the variables that only occur in the condition of branch could not be updated as a special case. 
To tackle this issue, we multiply the conditioned branches with Dirac symbol (i.e., [Formula: see text]), then approximate Dirac symbol with the continuous functions (e.g., [Formula: see text]). In this way, the gradients of condition-specific variables could be worked out in the back-propagation process, approximately, making a fully functional neural graph. Within our novel principle, we propose the neural decision tree (NDT), which takes simplified neural networks as decision function in each branch and employs complex neural networks to generate the output in each leaf. Extensive experiments verify our theoretical analysis and demonstrate the effectiveness of our model.\",\"PeriodicalId\":164619,\"journal\":{\"name\":\"Unmanned Syst.\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-02-03\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Unmanned Syst.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1142/s2301385020500132\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Unmanned Syst.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1142/s2301385020500132","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Neural Decision Tree Towards Fully Functional Neural Graph
Although traditional algorithms can be embedded into neural architectures using the principle proposed in [H. Xiao, Hungarian layer: Logics empowered neural architecture, arXiv:1712.02555], variables that occur only in a branch condition cannot be updated, as a special case. To address this issue, we multiply the conditioned branches by the Dirac symbol (i.e., [Formula: see text]) and then approximate the Dirac symbol with continuous functions (e.g., [Formula: see text]). In this way, the gradients of condition-specific variables can be computed, approximately, during back-propagation, yielding a fully functional neural graph. Within this novel principle, we propose the neural decision tree (NDT), which uses simplified neural networks as the decision function at each branch and employs complex neural networks to generate the output at each leaf. Extensive experiments verify our theoretical analysis and demonstrate the effectiveness of our model.
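The central trick the abstract describes — replacing a hard branch indicator with a continuous surrogate so that gradients reach variables appearing only in the condition — can be sketched as below. This is a minimal NumPy illustration under assumptions, not the paper's implementation: the sigmoid gate and the temperature parameter are choices made here for the sake of a concrete example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def hard_branch(c, left, right):
    # Hard branch: the step indicator 1[c > 0] selects an output, but its
    # derivative with respect to c is zero almost everywhere, so c (a
    # condition-only variable) receives no gradient.
    return left if c > 0 else right

def soft_branch(c, left, right, temperature=0.1):
    # Differentiable surrogate: the step indicator is approximated by a
    # sigmoid gate (an assumed choice of continuous function), so gradients
    # with respect to the condition variable c can flow in back-propagation.
    gate = sigmoid(c / temperature)
    return gate * left + (1.0 - gate) * right

# As temperature -> 0, the soft branch approaches the hard branch.
print(hard_branch(2.0, 1.0, -1.0))    # 1.0
print(soft_branch(2.0, 1.0, -1.0))    # close to 1.0
print(soft_branch(-2.0, 1.0, -1.0))   # close to -1.0
```

Because the soft branch is a convex combination of the two sub-outputs, both sides of the branch contribute to the forward pass, and the gate weight itself is a differentiable function of the condition.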