{"title":"Developing a framework for classification and/or recommendation","authors":"A. Tchangani, F. Pérés","doi":"10.1109/CoDIT49905.2020.9263795","DOIUrl":null,"url":null,"abstract":"The objective of this communication is to establish a framework for classifying or recommending an object characterized by several attributes into classes or for uses for which a nominal representative is known or some entry conditions are specified. In the case where classes are characterized by entry conditions, the mathematical problem to be solved is typically a constraints satisfaction problem. But in general, constraints are subject to uncertainty, so in this paper, we propose to transform these constraints into functions of membership or non-membership of fuzzy subsets; thus for each class, these functions, given an object to be classified or recommended, can be aggregated in synergy to give two measures: a measure of selectability of the class and a measure of rejectability; the final choice of the class is then made by optimizing an index based on these two measures. When classes are determined by a primary or main representative, the leader to whom the object to be classified should be compared, it seems natural to use measures of similarity or dissimilarity to classify the object in the right class. To do this, given that we consider that classes are characterized by normalized numerical indicators and therefore resemble a probabilistic structure, we propose to use Kullback-Leibler (KL) divergence that compares a given probability distribution to a main one as dissimilarity measure between an object and the representative of a class. 
The application of the approach developed to a real-world problem shows a certain potentiality.","PeriodicalId":355781,"journal":{"name":"2020 7th International Conference on Control, Decision and Information Technologies (CoDIT)","volume":"128 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 7th International Conference on Control, Decision and Information Technologies (CoDIT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CoDIT49905.2020.9263795","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
The objective of this communication is to establish a framework for classifying or recommending an object characterized by several attributes into classes, or for uses, for which either a nominal representative is known or entry conditions are specified. When classes are characterized by entry conditions, the mathematical problem to be solved is typically a constraint satisfaction problem. In general, however, constraints are subject to uncertainty, so in this paper we propose to transform them into membership or non-membership functions of fuzzy subsets. For each class, given an object to be classified or recommended, these functions can then be aggregated in synergy to yield two measures: a selectability measure and a rejectability measure of the class. The final choice of class is made by optimizing an index based on these two measures. When classes are instead determined by a primary or main representative, the leader with which the object to be classified should be compared, it is natural to use similarity or dissimilarity measures to assign the object to the right class. To do this, since we consider classes to be characterized by normalized numerical indicators, which therefore resemble probability distributions, we propose to use the Kullback-Leibler (KL) divergence, which compares a given probability distribution to a reference one, as the dissimilarity measure between an object and the representative of a class. The application of the developed approach to a real-world problem demonstrates its potential.
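The two classification modes described in the abstract can be sketched in code. The sketch below is illustrative only: the linear-ramp membership shape, the aggregation operators (product for selectability, probabilistic sum for rejectability), the ratio-style choice index, and all data values are assumptions, since the abstract does not specify the operators the authors actually use.

```python
import math

# --- Mode 1: classes defined by entry conditions, softened into fuzzy constraints ---

def membership(x, low, high):
    """Degree to which x satisfies the soft interval constraint [low, high].
    Full membership inside the interval, linear decay outside (assumed shape)."""
    if low <= x <= high:
        return 1.0
    span = high - low
    d = (low - x) if x < low else (x - high)
    return max(0.0, 1.0 - d / span)

def selectability(obj, constraints):
    """Synergetic aggregation (here: product) of membership degrees."""
    s = 1.0
    for attr, (low, high) in constraints.items():
        s *= membership(obj[attr], low, high)
    return s

def rejectability(obj, constraints):
    """Aggregation (here: probabilistic sum) of non-membership degrees."""
    r = 0.0
    for attr, (low, high) in constraints.items():
        nm = 1.0 - membership(obj[attr], low, high)
        r = r + nm - r * nm
    return r

def choose_class(obj, classes):
    """Pick the class optimizing a selectability/rejectability index (assumed ratio form)."""
    return max(classes,
               key=lambda c: selectability(obj, classes[c])
                             / (1e-9 + rejectability(obj, classes[c])))

# --- Mode 2: classes defined by a nominal representative; normalized
# indicator vectors compared via Kullback-Leibler divergence ---

def kl_divergence(p, q, eps=1e-12):
    """D(p || q) between two normalized attribute vectors; eps avoids log(0)."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def nearest_representative(obj, representatives):
    """Assign obj to the class whose representative minimizes KL divergence."""
    return min(representatives,
               key=lambda c: kl_divergence(obj, representatives[c]))

# Hypothetical data for both modes
classes = {"young": {"age": (0, 30)}, "senior": {"age": (55, 100)}}
print(choose_class({"age": 25}, classes))             # → young

reps = {"class_A": [0.7, 0.2, 0.1], "class_B": [0.1, 0.3, 0.6]}
print(nearest_representative([0.6, 0.3, 0.1], reps))  # → class_A
```

Note that KL divergence is asymmetric, so comparing the object to the representative, D(object || representative), and the reverse generally give different values; the direction chosen here is one possible reading of the abstract.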